Test Report: Hyperkit_macOS 18943

a95fbdf9550db8c431fa5a4c330192118acd2cbf:2024-08-31:36027

Tests failed: 24/220

TestOffline (195.21s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 start -p offline-docker-207000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit 
aab_offline_test.go:55: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p offline-docker-207000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit : exit status 80 (3m9.821525853s)

-- stdout --
	* [offline-docker-207000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "offline-docker-207000" primary control-plane node in "offline-docker-207000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "offline-docker-207000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
-- /stdout --
** stderr ** 
	I0831 16:14:24.902223    5912 out.go:345] Setting OutFile to fd 1 ...
	I0831 16:14:24.902411    5912 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:14:24.902417    5912 out.go:358] Setting ErrFile to fd 2...
	I0831 16:14:24.902421    5912 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:14:24.902602    5912 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 16:14:24.904389    5912 out.go:352] Setting JSON to false
	I0831 16:14:24.930802    5912 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":4435,"bootTime":1725141629,"procs":433,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 16:14:24.930912    5912 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 16:14:24.995919    5912 out.go:177] * [offline-docker-207000] minikube v1.33.1 on Darwin 14.6.1
	I0831 16:14:25.044098    5912 notify.go:220] Checking for updates...
	I0831 16:14:25.078017    5912 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 16:14:25.099086    5912 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:14:25.120048    5912 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 16:14:25.147056    5912 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 16:14:25.166955    5912 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:14:25.187985    5912 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 16:14:25.210266    5912 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 16:14:25.239184    5912 out.go:177] * Using the hyperkit driver based on user configuration
	I0831 16:14:25.281010    5912 start.go:297] selected driver: hyperkit
	I0831 16:14:25.281027    5912 start.go:901] validating driver "hyperkit" against <nil>
	I0831 16:14:25.281044    5912 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 16:14:25.284205    5912 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:14:25.284314    5912 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 16:14:25.292650    5912 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 16:14:25.296481    5912 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:14:25.296504    5912 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 16:14:25.296541    5912 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 16:14:25.296759    5912 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 16:14:25.296822    5912 cni.go:84] Creating CNI manager for ""
	I0831 16:14:25.296840    5912 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0831 16:14:25.296846    5912 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0831 16:14:25.296918    5912 start.go:340] cluster config:
	{Name:offline-docker-207000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:offline-docker-207000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:14:25.296996    5912 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:14:25.366167    5912 out.go:177] * Starting "offline-docker-207000" primary control-plane node in "offline-docker-207000" cluster
	I0831 16:14:25.386927    5912 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 16:14:25.387053    5912 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 16:14:25.387086    5912 cache.go:56] Caching tarball of preloaded images
	I0831 16:14:25.387309    5912 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 16:14:25.387331    5912 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 16:14:25.387821    5912 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/offline-docker-207000/config.json ...
	I0831 16:14:25.387871    5912 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/offline-docker-207000/config.json: {Name:mk4e00906ab0c27642428d48ff9b26190d41eeaa Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 16:14:25.388547    5912 start.go:360] acquireMachinesLock for offline-docker-207000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:14:25.388682    5912 start.go:364] duration metric: took 103.126µs to acquireMachinesLock for "offline-docker-207000"
	I0831 16:14:25.388724    5912 start.go:93] Provisioning new machine with config: &{Name:offline-docker-207000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:offline-docker-207000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 16:14:25.388825    5912 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 16:14:25.409968    5912 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0831 16:14:25.410114    5912 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:14:25.410151    5912 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:14:25.419173    5912 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53638
	I0831 16:14:25.419534    5912 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:14:25.419956    5912 main.go:141] libmachine: Using API Version  1
	I0831 16:14:25.419968    5912 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:14:25.420228    5912 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:14:25.420357    5912 main.go:141] libmachine: (offline-docker-207000) Calling .GetMachineName
	I0831 16:14:25.420453    5912 main.go:141] libmachine: (offline-docker-207000) Calling .DriverName
	I0831 16:14:25.420597    5912 start.go:159] libmachine.API.Create for "offline-docker-207000" (driver="hyperkit")
	I0831 16:14:25.420621    5912 client.go:168] LocalClient.Create starting
	I0831 16:14:25.420656    5912 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 16:14:25.420713    5912 main.go:141] libmachine: Decoding PEM data...
	I0831 16:14:25.420732    5912 main.go:141] libmachine: Parsing certificate...
	I0831 16:14:25.420818    5912 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 16:14:25.420856    5912 main.go:141] libmachine: Decoding PEM data...
	I0831 16:14:25.420868    5912 main.go:141] libmachine: Parsing certificate...
	I0831 16:14:25.420880    5912 main.go:141] libmachine: Running pre-create checks...
	I0831 16:14:25.420889    5912 main.go:141] libmachine: (offline-docker-207000) Calling .PreCreateCheck
	I0831 16:14:25.420998    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:25.421197    5912 main.go:141] libmachine: (offline-docker-207000) Calling .GetConfigRaw
	I0831 16:14:25.431137    5912 main.go:141] libmachine: Creating machine...
	I0831 16:14:25.431150    5912 main.go:141] libmachine: (offline-docker-207000) Calling .Create
	I0831 16:14:25.431299    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:25.431416    5912 main.go:141] libmachine: (offline-docker-207000) DBG | I0831 16:14:25.431277    5933 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:14:25.431502    5912 main.go:141] libmachine: (offline-docker-207000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 16:14:25.909490    5912 main.go:141] libmachine: (offline-docker-207000) DBG | I0831 16:14:25.909396    5933 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/id_rsa...
	I0831 16:14:26.069483    5912 main.go:141] libmachine: (offline-docker-207000) DBG | I0831 16:14:26.069400    5933 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/offline-docker-207000.rawdisk...
	I0831 16:14:26.069502    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Writing magic tar header
	I0831 16:14:26.069511    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Writing SSH key tar header
	I0831 16:14:26.069792    5912 main.go:141] libmachine: (offline-docker-207000) DBG | I0831 16:14:26.069763    5933 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000 ...
	I0831 16:14:26.520966    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:26.520987    5912 main.go:141] libmachine: (offline-docker-207000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/hyperkit.pid
	I0831 16:14:26.521032    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Using UUID 484b0656-c353-44cc-9bce-06ff8f256b19
	I0831 16:14:26.685048    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Generated MAC ce:d1:31:a1:33:1f
	I0831 16:14:26.685066    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-207000
	I0831 16:14:26.685104    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"484b0656-c353-44cc-9bce-06ff8f256b19", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000122330)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:14:26.685135    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"484b0656-c353-44cc-9bce-06ff8f256b19", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000122330)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:14:26.685185    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "484b0656-c353-44cc-9bce-06ff8f256b19", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/offline-docker-207000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-207000"}
	I0831 16:14:26.685234    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 484b0656-c353-44cc-9bce-06ff8f256b19 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/offline-docker-207000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-207000"
	I0831 16:14:26.685247    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:14:26.688387    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 DEBUG: hyperkit: Pid is 5958
	I0831 16:14:26.688799    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 0
	I0831 16:14:26.688815    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:26.688917    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:26.690005    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:26.690088    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:26.690105    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:26.690144    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:26.690176    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:26.690201    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:26.690221    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:26.690246    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:26.690261    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:26.690275    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:26.690289    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:26.690304    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:26.690319    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:26.690357    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:26.690372    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:26.690387    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:26.690399    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:26.690409    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:26.690425    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:26.696325    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:14:26.827439    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:14:26.828050    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:14:26.828067    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:14:26.828087    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:14:26.828101    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:14:27.206273    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:27 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:14:27.206294    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:27 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:14:27.321341    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:14:27.321369    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:14:27.321379    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:14:27.321389    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:14:27.322101    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:27 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:14:27.322111    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:27 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:14:28.690977    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 1
	I0831 16:14:28.690991    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:28.691092    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:28.691876    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:28.691935    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:28.691961    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:28.691970    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:28.691981    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:28.691991    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:28.691996    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:28.692003    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:28.692011    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:28.692025    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:28.692032    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:28.692038    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:28.692046    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:28.692056    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:28.692064    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:28.692071    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:28.692084    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:28.692091    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:28.692101    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:30.693014    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 2
	I0831 16:14:30.693040    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:30.693112    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:30.693943    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:30.693980    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:30.693991    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:30.694000    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:30.694009    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:30.694021    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:30.694030    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:30.694045    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:30.694062    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:30.694069    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:30.694080    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:30.694088    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:30.694096    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:30.694102    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:30.694110    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:30.694117    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:30.694125    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:30.694135    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:30.694143    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:32.694694    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 3
	I0831 16:14:32.694710    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:32.694801    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:32.695588    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:32.695644    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:32.695656    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:32.695673    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:32.695681    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:32.695689    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:32.695695    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:32.695702    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:32.695709    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:32.695715    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:32.695721    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:32.695728    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:32.695758    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:32.695780    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:32.695792    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:32.695800    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:32.695809    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:32.695816    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:32.695834    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:32.699797    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:32 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 16:14:32.699959    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:32 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 16:14:32.699969    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:32 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 16:14:32.720009    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:14:32 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 16:14:34.696793    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 4
	I0831 16:14:34.696811    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:34.696892    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:34.697663    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:34.697727    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:34.697737    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:34.697749    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:34.697755    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:34.697762    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:34.697795    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:34.697805    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:34.697814    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:34.697821    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:34.697827    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:34.697837    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:34.697850    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:34.697859    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:34.697866    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:34.697873    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:34.697886    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:34.697898    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:34.697909    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:36.699951    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 5
	I0831 16:14:36.699964    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:36.700017    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:36.700798    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:36.700863    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:36.700872    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:36.700888    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:36.700900    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:36.700907    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:36.700914    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:36.700943    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:36.700958    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:36.700966    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:36.700973    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:36.700982    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:36.700992    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:36.701003    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:36.701018    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:36.701040    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:36.701051    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:36.701060    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:36.701083    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:38.702255    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 6
	I0831 16:14:38.702269    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:38.702398    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:38.703173    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:38.703229    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:38.703239    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:38.703248    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:38.703255    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:38.703262    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:38.703270    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:38.703290    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:38.703308    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:38.703319    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:38.703335    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:38.703348    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:38.703356    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:38.703363    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:38.703372    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:38.703380    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:38.703388    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:38.703396    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:38.703404    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:40.705383    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 7
	I0831 16:14:40.705397    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:40.705488    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:40.706277    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:40.706341    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:40.706351    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:40.706360    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:40.706366    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:40.706380    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:40.706390    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:40.706399    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:40.706410    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:40.706419    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:40.706439    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:40.706457    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:40.706471    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:40.706487    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:40.706495    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:40.706502    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:40.706508    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:40.706515    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:40.706528    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:42.707535    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 8
	I0831 16:14:42.707552    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:42.707601    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:42.708403    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:42.708429    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:42.708442    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:42.708450    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:42.708466    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:42.708476    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:42.708484    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:42.708493    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:42.708503    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:42.708513    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:42.708528    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:42.708540    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:42.708548    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:42.708557    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:42.708564    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:42.708570    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:42.708577    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:42.708595    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:42.708607    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:44.709206    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 9
	I0831 16:14:44.709221    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:44.709264    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:44.710022    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:44.710070    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:44.710089    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:44.710098    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:44.710104    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:44.710110    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:44.710115    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:44.710130    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:44.710142    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:44.710150    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:44.710158    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:44.710174    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:44.710182    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:44.710189    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:44.710197    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:44.710205    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:44.710213    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:44.710222    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:44.710235    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:46.711815    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 10
	I0831 16:14:46.711830    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:46.711899    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:46.712709    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:46.712758    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:46.712768    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:46.712785    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:46.712798    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:46.712817    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:46.712824    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:46.712831    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:46.712836    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:46.712843    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:46.712852    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:46.712860    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:46.712870    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:46.712877    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:46.712883    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:46.712889    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:46.712896    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:46.712906    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:46.712915    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:48.712914    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 11
	I0831 16:14:48.712937    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:48.712990    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:48.713754    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:48.713808    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:48.713817    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:48.713826    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:48.713832    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:48.713839    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:48.713844    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:48.713850    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:48.713855    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:48.713879    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:48.713890    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:48.713902    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:48.713912    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:48.713919    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:48.713930    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:48.713938    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:48.713946    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:48.713953    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:48.713960    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:50.715023    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 12
	I0831 16:14:50.715039    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:50.715106    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:50.715912    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:50.715957    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:50.715967    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:50.715981    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:50.715988    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:50.716007    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:50.716013    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:50.716041    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:50.716055    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:50.716063    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:50.716070    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:50.716078    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:50.716087    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:50.716095    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:50.716104    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:50.716113    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:50.716125    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:50.716135    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:50.716143    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:52.716294    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 13
	I0831 16:14:52.716310    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:52.716349    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:52.717136    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:52.717189    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:52.717202    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:52.717213    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:52.717219    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:52.717226    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:52.717233    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:52.717242    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:52.717250    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:52.717257    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:52.717266    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:52.717273    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:52.717280    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:52.717285    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:52.717294    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:52.717300    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:52.717318    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:52.717328    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:52.717336    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:54.719028    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 14
	I0831 16:14:54.719044    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:54.719119    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:54.719910    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:54.719951    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:54.719965    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:54.719978    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:54.719986    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:54.719996    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:54.720015    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:54.720027    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:54.720035    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:54.720043    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:54.720082    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:54.720103    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:54.720126    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:54.720136    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:54.720145    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:54.720153    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:54.720167    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:54.720184    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:54.720195    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:56.721645    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 15
	I0831 16:14:56.721661    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:56.721716    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:56.722512    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:56.722562    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:56.722581    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:56.722591    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:56.722598    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:56.722606    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:56.722614    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:56.722621    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:56.722627    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:56.722634    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:56.722640    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:56.722646    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:56.722653    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:56.722661    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:56.722668    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:56.722677    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:56.722684    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:56.722692    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:56.722700    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:14:58.724726    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 16
	I0831 16:14:58.724739    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:14:58.724838    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:14:58.725604    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:14:58.725658    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:14:58.725681    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:14:58.725690    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:14:58.725696    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:14:58.725714    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:14:58.725729    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:14:58.725736    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:14:58.725745    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:14:58.725752    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:14:58.725760    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:14:58.725766    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:14:58.725774    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:14:58.725790    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:14:58.725802    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:14:58.725809    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:14:58.725818    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:14:58.725835    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:14:58.725849    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:00.726065    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 17
	I0831 16:15:00.726078    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:00.726209    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:00.726996    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:00.727034    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:00.727042    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:00.727057    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:00.727063    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:00.727081    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:00.727091    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:00.727098    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:00.727104    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:00.727120    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:00.727128    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:00.727146    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:00.727158    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:00.727166    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:00.727172    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:00.727197    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:00.727211    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:00.727219    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:00.727228    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:02.728048    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 18
	I0831 16:15:02.728062    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:02.728188    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:02.729275    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:02.729311    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:02.729319    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:02.729350    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:02.729362    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:02.729374    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:02.729384    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:02.729405    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:02.729421    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:02.729433    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:02.729448    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:02.729458    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:02.729466    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:02.729474    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:02.729482    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:02.729489    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:02.729500    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:02.729513    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:02.729523    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:04.731500    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 19
	I0831 16:15:04.731517    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:04.731585    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:04.732373    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:04.732476    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:04.732484    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:04.732491    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:04.732497    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:04.732504    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:04.732510    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:04.732535    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:04.732546    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:04.732554    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:04.732572    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:04.732580    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:04.732588    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:04.732597    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:04.732612    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:04.732625    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:04.732636    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:04.732644    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:04.732653    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:06.733228    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 20
	I0831 16:15:06.733249    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:06.733325    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:06.734114    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:06.734161    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:06.734173    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:06.734195    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:06.734203    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:06.734211    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:06.734220    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:06.734239    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:06.734252    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:06.734262    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:06.734288    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:06.734304    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:06.734312    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:06.734319    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:06.734326    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:06.734334    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:06.734347    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:06.734356    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:06.734372    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:08.734522    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 21
	I0831 16:15:08.734534    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:08.734586    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:08.735373    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:08.735454    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:08.735488    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:08.735505    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:08.735512    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:08.735519    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:08.735525    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:08.735541    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:08.735551    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:08.735559    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:08.735568    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:08.735582    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:08.735595    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:08.735604    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:08.735625    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:08.735635    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:08.735642    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:08.735648    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:08.735655    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:10.737676    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 22
	I0831 16:15:10.737688    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:10.737740    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:10.738604    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:10.738651    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:10.738664    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:10.738674    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:10.738682    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:10.738689    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:10.738695    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:10.738702    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:10.738708    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:10.738715    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:10.738723    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:10.738731    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:10.738736    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:10.738758    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:10.738766    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:10.738774    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:10.738782    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:10.738790    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:10.738796    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:12.740904    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 23
	I0831 16:15:12.740920    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:12.740974    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:12.741784    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:12.741815    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:12.741826    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:12.741837    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:12.741844    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:12.741853    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:12.741861    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:12.741879    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:12.741893    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:12.741902    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:12.741909    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:12.741922    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:12.741931    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:12.741945    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:12.741958    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:12.741966    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:12.741974    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:12.741986    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:12.741994    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:14.743989    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 24
	I0831 16:15:14.744003    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:14.744167    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:14.745108    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:14.745142    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:14.745152    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:14.745161    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:14.745168    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:14.745183    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:14.745197    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:14.745208    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:14.745229    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:14.745242    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:14.745251    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:14.745259    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:14.745268    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:14.745285    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:14.745296    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:14.745315    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:14.745327    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:14.745335    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:14.745344    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:16.747357    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 25
	I0831 16:15:16.747370    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:16.747419    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:16.748282    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:16.748336    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:16.748345    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:16.748357    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:16.748363    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:16.748369    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:16.748374    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:16.748381    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:16.748386    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:16.748393    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:16.748401    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:16.748416    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:16.748431    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:16.748457    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:16.748474    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:16.748482    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:16.748490    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:16.748500    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:16.748509    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:18.750535    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 26
	I0831 16:15:18.750550    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:18.750605    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:18.751427    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:18.751479    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:18.751488    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:18.751503    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:18.751517    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:18.751526    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:18.751532    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:18.751538    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:18.751548    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:18.751556    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:18.751565    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:18.751573    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:18.751581    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:18.751588    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:18.751594    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:18.751607    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:18.751615    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:18.751623    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:18.751631    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:20.751922    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 27
	I0831 16:15:20.751936    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:20.752008    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:20.752794    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:20.752830    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:20.752838    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:20.752859    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:20.752876    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:20.752892    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:20.752905    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:20.752914    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:20.752922    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:20.752929    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:20.752949    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:20.752966    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:20.752975    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:20.752983    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:20.752991    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:20.753004    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:20.753013    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:20.753019    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:20.753028    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:22.755008    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 28
	I0831 16:15:22.755025    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:22.755098    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:22.756120    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:22.756164    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:22.756188    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:22.756199    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:22.756208    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:22.756216    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:22.756222    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:22.756228    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:22.756234    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:22.756247    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:22.756260    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:22.756274    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:22.756283    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:22.756291    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:22.756305    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:22.756322    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:22.756334    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:22.756349    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:22.756357    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:24.758284    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 29
	I0831 16:15:24.758298    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:24.758374    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:24.759153    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases ...
	I0831 16:15:24.759209    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:24.759222    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:24.759235    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:24.759247    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:24.759257    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:24.759267    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:24.759275    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:24.759285    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:24.759298    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:24.759306    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:24.759312    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:24.759321    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:24.759330    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:24.759338    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:24.759345    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:24.759353    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:24.759360    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:24.759365    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:26.759814    5912 client.go:171] duration metric: took 1m1.338778904s to LocalClient.Create
	I0831 16:15:28.761953    5912 start.go:128] duration metric: took 1m3.372691934s to createHost
	I0831 16:15:28.761982    5912 start.go:83] releasing machines lock for "offline-docker-207000", held for 1m3.372872058s
	W0831 16:15:28.762035    5912 start.go:714] error starting host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ce:d1:31:a1:33:1f
	I0831 16:15:28.762396    5912 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:15:28.762426    5912 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:15:28.771984    5912 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53674
	I0831 16:15:28.772499    5912 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:15:28.772957    5912 main.go:141] libmachine: Using API Version  1
	I0831 16:15:28.772971    5912 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:15:28.773213    5912 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:15:28.773620    5912 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:15:28.773663    5912 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:15:28.782310    5912 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53676
	I0831 16:15:28.782698    5912 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:15:28.783138    5912 main.go:141] libmachine: Using API Version  1
	I0831 16:15:28.783155    5912 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:15:28.783377    5912 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:15:28.783518    5912 main.go:141] libmachine: (offline-docker-207000) Calling .GetState
	I0831 16:15:28.783616    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:28.783709    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:28.784685    5912 main.go:141] libmachine: (offline-docker-207000) Calling .DriverName
	I0831 16:15:28.825457    5912 out.go:177] * Deleting "offline-docker-207000" in hyperkit ...
	I0831 16:15:28.883388    5912 main.go:141] libmachine: (offline-docker-207000) Calling .Remove
	I0831 16:15:28.883528    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:28.883540    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:28.883600    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:28.884546    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:28.884605    5912 main.go:141] libmachine: (offline-docker-207000) DBG | waiting for graceful shutdown
	I0831 16:15:29.884751    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:29.884848    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:29.885838    5912 main.go:141] libmachine: (offline-docker-207000) DBG | waiting for graceful shutdown
	I0831 16:15:30.886119    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:30.886260    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:30.887883    5912 main.go:141] libmachine: (offline-docker-207000) DBG | waiting for graceful shutdown
	I0831 16:15:31.889508    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:31.889580    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:31.890340    5912 main.go:141] libmachine: (offline-docker-207000) DBG | waiting for graceful shutdown
	I0831 16:15:32.892048    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:32.892263    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:32.893033    5912 main.go:141] libmachine: (offline-docker-207000) DBG | waiting for graceful shutdown
	I0831 16:15:33.893610    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:33.893670    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 5958
	I0831 16:15:33.894699    5912 main.go:141] libmachine: (offline-docker-207000) DBG | sending sigkill
	I0831 16:15:33.894710    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:33.907134    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:15:33 WARN : hyperkit: failed to read stdout: EOF
	I0831 16:15:33.907154    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:15:33 WARN : hyperkit: failed to read stderr: EOF
	W0831 16:15:33.923635    5912 out.go:270] ! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ce:d1:31:a1:33:1f
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ce:d1:31:a1:33:1f
	I0831 16:15:33.923649    5912 start.go:729] Will try again in 5 seconds ...
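	[Editor's note] The repeated "Searching for ce:d1:31:a1:33:1f in /var/db/dhcpd_leases" attempts above show the driver polling the macOS DHCP lease file for the VM's MAC address; the failure is that this MAC never appears among the 17 entries found. A minimal sketch of such a lookup, matching against entries in the form the log prints (this is an illustrative reconstruction, not the driver's actual code; `findIPForMAC` is a hypothetical helper):
	
	```go
	package main
	
	import (
		"fmt"
		"regexp"
		"strings"
	)
	
	// findIPForMAC scans lease entries of the form printed in the log, e.g.
	// "{Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:... Lease:...}",
	// and returns the IPAddress of the entry whose HWAddress matches mac.
	func findIPForMAC(entries []string, mac string) (string, bool) {
		re := regexp.MustCompile(`IPAddress:(\S+) HWAddress:(\S+)`)
		for _, e := range entries {
			m := re.FindStringSubmatch(e)
			if m != nil && strings.EqualFold(m[2], mac) {
				return m[1], true
			}
		}
		return "", false
	}
	
	func main() {
		entries := []string{
			"{Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}",
			"{Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}",
		}
		// The MAC the driver is waiting for is absent, so ok is false --
		// which is exactly the "IP address never found" condition reported below.
		ip, ok := findIPForMAC(entries, "ce:d1:31:a1:33:1f")
		fmt.Println(ip, ok)
	}
	```
	
	Once the retry budget is exhausted (30 attempts at 2-second intervals, judging by the timestamps), the driver gives up with the "could not find an IP address" error seen in the StartHost failure.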
	I0831 16:15:38.923996    5912 start.go:360] acquireMachinesLock for offline-docker-207000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:16:31.689561    5912 start.go:364] duration metric: took 52.765173316s to acquireMachinesLock for "offline-docker-207000"
	I0831 16:16:31.689602    5912 start.go:93] Provisioning new machine with config: &{Name:offline-docker-207000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:offline-docker-207000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 16:16:31.689652    5912 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 16:16:31.711055    5912 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0831 16:16:31.711128    5912 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:16:31.711174    5912 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:16:31.719843    5912 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53684
	I0831 16:16:31.720198    5912 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:16:31.720571    5912 main.go:141] libmachine: Using API Version  1
	I0831 16:16:31.720588    5912 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:16:31.720808    5912 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:16:31.720926    5912 main.go:141] libmachine: (offline-docker-207000) Calling .GetMachineName
	I0831 16:16:31.721019    5912 main.go:141] libmachine: (offline-docker-207000) Calling .DriverName
	I0831 16:16:31.721126    5912 start.go:159] libmachine.API.Create for "offline-docker-207000" (driver="hyperkit")
	I0831 16:16:31.721146    5912 client.go:168] LocalClient.Create starting
	I0831 16:16:31.721173    5912 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 16:16:31.721225    5912 main.go:141] libmachine: Decoding PEM data...
	I0831 16:16:31.721235    5912 main.go:141] libmachine: Parsing certificate...
	I0831 16:16:31.721274    5912 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 16:16:31.721310    5912 main.go:141] libmachine: Decoding PEM data...
	I0831 16:16:31.721324    5912 main.go:141] libmachine: Parsing certificate...
	I0831 16:16:31.721337    5912 main.go:141] libmachine: Running pre-create checks...
	I0831 16:16:31.721343    5912 main.go:141] libmachine: (offline-docker-207000) Calling .PreCreateCheck
	I0831 16:16:31.721414    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:31.721447    5912 main.go:141] libmachine: (offline-docker-207000) Calling .GetConfigRaw
	I0831 16:16:31.753991    5912 main.go:141] libmachine: Creating machine...
	I0831 16:16:31.754001    5912 main.go:141] libmachine: (offline-docker-207000) Calling .Create
	I0831 16:16:31.754088    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:31.754281    5912 main.go:141] libmachine: (offline-docker-207000) DBG | I0831 16:16:31.754081    6115 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:16:31.754322    5912 main.go:141] libmachine: (offline-docker-207000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 16:16:31.960539    5912 main.go:141] libmachine: (offline-docker-207000) DBG | I0831 16:16:31.960438    6115 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/id_rsa...
	I0831 16:16:32.006443    5912 main.go:141] libmachine: (offline-docker-207000) DBG | I0831 16:16:32.006369    6115 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/offline-docker-207000.rawdisk...
	I0831 16:16:32.006458    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Writing magic tar header
	I0831 16:16:32.006467    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Writing SSH key tar header
	I0831 16:16:32.006852    5912 main.go:141] libmachine: (offline-docker-207000) DBG | I0831 16:16:32.006814    6115 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000 ...
	I0831 16:16:32.366900    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:32.366923    5912 main.go:141] libmachine: (offline-docker-207000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/hyperkit.pid
	I0831 16:16:32.366967    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Using UUID 40ae9f30-eecc-4864-86b9-2538178f7571
	I0831 16:16:32.391891    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Generated MAC d6:9a:ac:6:65:b0
	I0831 16:16:32.391908    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-207000
	I0831 16:16:32.391941    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"40ae9f30-eecc-4864-86b9-2538178f7571", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:16:32.391977    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"40ae9f30-eecc-4864-86b9-2538178f7571", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:16:32.392041    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "40ae9f30-eecc-4864-86b9-2538178f7571", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/offline-docker-207000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-207000"}
	I0831 16:16:32.392089    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 40ae9f30-eecc-4864-86b9-2538178f7571 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/offline-docker-207000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-207000"
	I0831 16:16:32.392099    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:16:32.395725    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 DEBUG: hyperkit: Pid is 6116
	I0831 16:16:32.396371    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 0
	I0831 16:16:32.396396    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:32.396516    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:32.397356    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:32.397416    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:32.397429    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:32.397445    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:32.397455    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:32.397476    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:32.397486    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:32.397508    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:32.397532    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:32.397550    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:32.397560    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:32.397573    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:32.397582    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:32.397592    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:32.397599    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:32.397606    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:32.397614    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:32.397623    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:32.397633    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:32.402747    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:16:32.410996    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/offline-docker-207000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:16:32.411907    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:16:32.411928    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:16:32.411943    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:16:32.411949    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:16:32.788453    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:16:32.788465    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:16:32.903788    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:16:32.903801    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:16:32.903812    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:16:32.903858    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:16:32.904806    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:16:32.904822    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:32 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:16:34.398362    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 1
	I0831 16:16:34.398376    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:34.398477    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:34.399258    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:34.399330    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:34.399342    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:34.399371    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:34.399380    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:34.399388    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:34.399396    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:34.399403    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:34.399411    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:34.399419    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:34.399428    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:34.399434    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:34.399440    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:34.399446    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:34.399453    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:34.399462    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:34.399470    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:34.399476    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:34.399497    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:36.400635    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 2
	I0831 16:16:36.400654    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:36.400739    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:36.401522    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:36.401574    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:36.401585    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:36.401594    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:36.401601    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:36.401617    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:36.401630    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:36.401638    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:36.401651    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:36.401659    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:36.401668    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:36.401679    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:36.401689    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:36.401701    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:36.401713    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:36.401721    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:36.401728    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:36.401735    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:36.401744    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:38.273031    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:38 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 16:16:38.273140    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:38 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 16:16:38.273149    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:38 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 16:16:38.293905    5912 main.go:141] libmachine: (offline-docker-207000) DBG | 2024/08/31 16:16:38 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 16:16:38.403879    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 3
	I0831 16:16:38.403903    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:38.404140    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:38.405670    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:38.405770    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:38.405822    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:38.405839    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:38.405853    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:38.405867    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:38.405878    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:38.405891    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:38.405915    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:38.405933    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:38.405946    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:38.405955    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:38.405967    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:38.405980    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:38.405991    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:38.406002    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:38.406017    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:38.406028    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:38.406039    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:40.406060    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 4
	I0831 16:16:40.406076    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:40.406142    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:40.406933    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:40.407000    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:40.407012    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:40.407033    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:40.407044    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:40.407058    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:40.407066    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:40.407073    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:40.407079    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:40.407085    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:40.407092    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:40.407099    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:40.407105    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:40.407111    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:40.407120    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:40.407141    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:40.407153    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:40.407162    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:40.407171    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:42.409224    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 5
	I0831 16:16:42.409256    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:42.409270    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:42.410047    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:42.410087    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:42.410102    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:42.410115    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:42.410124    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:42.410146    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:42.410159    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:42.410170    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:42.410178    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:42.410185    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:42.410192    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:42.410207    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:42.410220    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:42.410234    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:42.410244    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:42.410252    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:42.410262    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:42.410269    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:42.410277    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:44.412274    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 6
	I0831 16:16:44.412290    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:44.412340    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:44.413252    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:44.413305    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:44.413317    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:44.413327    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:44.413334    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:44.413341    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:44.413356    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:44.413364    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:44.413372    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:44.413380    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:44.413388    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:44.413406    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:44.413418    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:44.413428    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:44.413437    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:44.413446    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:44.413454    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:44.413461    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:44.413479    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:46.414667    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 7
	I0831 16:16:46.414681    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:46.414691    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:46.415514    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:46.415546    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:46.415564    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:46.415574    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:46.415581    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:46.415589    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:46.415597    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:46.415607    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:46.415616    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:46.415627    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:46.415634    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:46.415643    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:46.415650    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:46.415661    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:46.415678    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:46.415687    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:46.415695    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:46.415704    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:46.415721    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:48.417246    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 8
	I0831 16:16:48.417263    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:48.417329    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:48.418135    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:48.418188    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:48.418201    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:48.418229    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:48.418251    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:48.418261    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:48.418270    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:48.418278    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:48.418292    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:48.418304    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:48.418316    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:48.418331    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:48.418343    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:48.418354    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:48.418359    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:48.418375    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:48.418387    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:48.418405    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:48.418419    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:50.420406    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 9
	I0831 16:16:50.420422    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:50.420460    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:50.421252    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:50.421285    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:50.421298    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:50.421328    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:50.421340    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:50.421348    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:50.421355    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:50.421363    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:50.421371    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:50.421378    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:50.421386    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:50.421401    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:50.421413    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:50.421424    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:50.421432    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:50.421440    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:50.421446    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:50.421452    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:50.421461    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:52.423512    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 10
	I0831 16:16:52.423528    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:52.423591    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:52.424397    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:52.424444    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:52.424455    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:52.424465    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:52.424472    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:52.424483    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:52.424490    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:52.424497    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:52.424503    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:52.424510    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:52.424516    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:52.424523    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:52.424531    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:52.424540    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:52.424547    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:52.424556    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:52.424564    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:52.424580    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:52.424588    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:54.426285    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 11
	I0831 16:16:54.426300    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:54.426375    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:54.427173    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:54.427222    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:54.427232    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:54.427240    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:54.427248    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:54.427261    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:54.427274    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:54.427284    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:54.427294    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:54.427306    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:54.427314    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:54.427322    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:54.427328    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:54.427337    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:54.427350    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:54.427361    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:54.427368    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:54.427376    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:54.427387    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:56.429403    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 12
	I0831 16:16:56.429418    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:56.429479    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:56.430297    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:56.430342    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:56.430353    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:56.430375    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:56.430386    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:56.430398    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:56.430412    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:56.430420    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:56.430428    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:56.430436    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:56.430444    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:56.430450    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:56.430457    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:56.430470    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:56.430484    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:56.430492    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:56.430500    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:56.430508    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:56.430515    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:58.431864    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 13
	I0831 16:16:58.431881    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:58.431951    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:16:58.432776    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:16:58.432833    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:58.432846    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:58.432855    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:58.432861    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:58.432876    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:58.432895    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:58.432903    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:58.432910    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:58.432919    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:58.432925    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:58.432934    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:58.432943    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:58.432951    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:58.432958    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:58.432966    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:58.432980    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:58.432991    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:58.433002    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:00.433157    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 14
	I0831 16:17:00.433173    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:00.433238    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:00.434041    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:00.434099    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:00.434112    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:00.434142    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:00.434149    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:00.434155    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:00.434165    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:00.434174    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:00.434181    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:00.434187    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:00.434193    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:00.434199    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:00.434205    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:00.434224    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:00.434236    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:00.434245    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:00.434253    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:00.434260    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:00.434268    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:02.436299    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 15
	I0831 16:17:02.436326    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:02.436393    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:02.437231    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:02.437272    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:02.437280    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:02.437313    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:02.437333    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:02.437351    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:02.437360    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:02.437375    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:02.437391    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:02.437407    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:02.437421    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:02.437437    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:02.437446    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:02.437456    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:02.437465    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:02.437472    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:02.437480    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:02.437486    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:02.437494    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:04.438572    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 16
	I0831 16:17:04.438585    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:04.438649    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:04.439551    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:04.439600    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:04.439610    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:04.439620    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:04.439627    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:04.439633    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:04.439639    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:04.439646    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:04.439655    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:04.439662    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:04.439682    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:04.439694    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:04.439706    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:04.439714    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:04.439723    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:04.439737    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:04.439745    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:04.439755    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:04.439763    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:06.440130    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 17
	I0831 16:17:06.440145    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:06.440194    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:06.441012    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:06.441067    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:06.441079    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:06.441101    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:06.441112    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:06.441123    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:06.441134    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:06.441150    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:06.441164    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:06.441172    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:06.441179    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:06.441201    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:06.441222    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:06.441238    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:06.441252    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:06.441262    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:06.441271    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:06.441282    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:06.441292    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:08.442051    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 18
	I0831 16:17:08.442064    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:08.442097    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:08.442905    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:08.442920    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:08.442936    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:08.442954    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:08.442961    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:08.442989    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:08.443000    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:08.443013    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:08.443034    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:08.443049    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:08.443058    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:08.443065    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:08.443073    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:08.443081    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:08.443097    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:08.443105    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:08.443112    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:08.443119    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:08.443127    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:10.443430    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 19
	I0831 16:17:10.443446    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:10.443517    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:10.444330    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:10.444371    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:10.444379    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:10.444390    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:10.444396    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:10.444403    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:10.444409    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:10.444415    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:10.444421    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:10.444433    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:10.444441    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:10.444448    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:10.444456    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:10.444463    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:10.444472    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:10.444479    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:10.444488    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:10.444495    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:10.444501    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:12.445339    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 20
	I0831 16:17:12.445356    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:12.445381    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:12.446179    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:12.446243    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:12.446254    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:12.446262    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:12.446269    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:12.446276    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:12.446285    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:12.446291    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:12.446300    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:12.446307    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:12.446316    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:12.446324    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:12.446333    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:12.446340    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:12.446347    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:12.446355    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:12.446362    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:12.446380    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:12.446392    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:14.448421    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 21
	I0831 16:17:14.448441    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:14.448508    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:14.449331    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:14.449392    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:14.449402    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:14.449411    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:14.449418    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:14.449425    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:14.449431    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:14.449445    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:14.449452    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:14.449461    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:14.449468    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:14.449476    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:14.449492    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:14.449504    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:14.449514    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:14.449523    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:14.449537    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:14.449546    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:14.449560    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:16.450082    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 22
	I0831 16:17:16.450097    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:16.450161    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:16.450987    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:16.451032    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:16.451042    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:16.451064    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:16.451072    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:16.451078    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:16.451084    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:16.451090    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:16.451097    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:16.451105    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:16.451113    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:16.451120    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:16.451127    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:16.451142    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:16.451151    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:16.451159    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:16.451167    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:16.451174    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:16.451182    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:18.453239    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 23
	I0831 16:17:18.453254    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:18.453321    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:18.454173    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:18.454226    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:18.454246    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:18.454274    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:18.454284    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:18.454300    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:18.454315    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:18.454323    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:18.454331    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:18.454344    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:18.454351    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:18.454359    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:18.454370    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:18.454379    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:18.454388    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:18.454396    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:18.454405    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:18.454413    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:18.454425    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:20.456404    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 24
	I0831 16:17:20.456416    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:20.456504    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:20.457311    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:20.457354    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:20.457362    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:20.457373    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:20.457385    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:20.457393    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:20.457402    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:20.457410    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:20.457417    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:20.457424    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:20.457434    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:20.457457    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:20.457475    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:20.457490    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:20.457503    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:20.457511    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:20.457519    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:20.457534    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:20.457549    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:22.458543    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 25
	I0831 16:17:22.458568    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:22.458623    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:22.459419    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:22.459464    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:22.459477    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:22.459506    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:22.459521    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:22.459527    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:22.459535    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:22.459543    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:22.459550    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:22.459556    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:22.459569    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:22.459583    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:22.459592    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:22.459600    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:22.459607    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:22.459615    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:22.459622    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:22.459630    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:22.459641    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:24.461658    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 26
	I0831 16:17:24.461670    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:24.461774    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:24.462532    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:24.462593    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:24.462607    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:24.462616    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:24.462623    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:24.462629    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:24.462635    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:24.462664    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:24.462673    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:24.462680    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:24.462699    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:24.462711    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:24.462719    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:24.462727    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:24.462734    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:24.462750    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:24.462774    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:24.462786    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:24.462796    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:26.463277    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 27
	I0831 16:17:26.463292    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:26.463427    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:26.464197    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:26.464253    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:26.464265    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:26.464273    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:26.464280    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:26.464289    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:26.464300    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:26.464309    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:26.464316    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:26.464322    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:26.464330    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:26.464337    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:26.464346    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:26.464353    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:26.464360    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:26.464368    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:26.464377    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:26.464393    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:26.464406    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:28.465261    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 28
	I0831 16:17:28.465274    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:28.465342    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:28.466157    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:28.466208    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:28.466217    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:28.466227    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:28.466244    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:28.466252    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:28.466258    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:28.466266    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:28.466273    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:28.466284    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:28.466300    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:28.466312    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:28.466320    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:28.466328    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:28.466348    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:28.466367    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:28.466376    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:28.466383    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:28.466398    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:30.466721    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Attempt 29
	I0831 16:17:30.466739    5912 main.go:141] libmachine: (offline-docker-207000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:30.466791    5912 main.go:141] libmachine: (offline-docker-207000) DBG | hyperkit pid from json: 6116
	I0831 16:17:30.467664    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Searching for d6:9a:ac:6:65:b0 in /var/db/dhcpd_leases ...
	I0831 16:17:30.467716    5912 main.go:141] libmachine: (offline-docker-207000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:30.467738    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:30.467764    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:30.467779    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:30.467787    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:30.467794    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:30.467808    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:30.467819    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:30.467827    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:30.467835    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:30.467843    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:30.467851    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:30.467858    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:30.467867    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:30.467880    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:30.467887    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:30.467893    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:30.467901    5912 main.go:141] libmachine: (offline-docker-207000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:32.469130    5912 client.go:171] duration metric: took 1m0.74757698s to LocalClient.Create
	I0831 16:17:34.471293    5912 start.go:128] duration metric: took 1m2.781218831s to createHost
	I0831 16:17:34.471321    5912 start.go:83] releasing machines lock for "offline-docker-207000", held for 1m2.781319514s
	W0831 16:17:34.471468    5912 out.go:270] * Failed to start hyperkit VM. Running "minikube delete -p offline-docker-207000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for d6:9a:ac:6:65:b0
	* Failed to start hyperkit VM. Running "minikube delete -p offline-docker-207000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for d6:9a:ac:6:65:b0
	I0831 16:17:34.534780    5912 out.go:201] 
	W0831 16:17:34.555696    5912 out.go:270] X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for d6:9a:ac:6:65:b0
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for d6:9a:ac:6:65:b0
	W0831 16:17:34.555724    5912 out.go:270] * 
	* 
	W0831 16:17:34.556396    5912 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 16:17:34.618622    5912 out.go:201] 

** /stderr **
aab_offline_test.go:58: out/minikube-darwin-amd64 start -p offline-docker-207000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit  failed: exit status 80
panic.go:626: *** TestOffline FAILED at 2024-08-31 16:17:34.727713 -0700 PDT m=+4351.924988299
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p offline-docker-207000 -n offline-docker-207000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p offline-docker-207000 -n offline-docker-207000: exit status 7 (80.994666ms)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0831 16:17:34.806741    6140 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0831 16:17:34.806762    6140 status.go:249] status error: getting IP: IP address is not set

** /stderr **
helpers_test.go:240: status error: exit status 7 (may be ok)
helpers_test.go:242: "offline-docker-207000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:176: Cleaning up "offline-docker-207000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p offline-docker-207000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p offline-docker-207000: (5.246186665s)
--- FAIL: TestOffline (195.21s)

TestAddons/parallel/Registry (74.03s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:332: registry stabilized in 1.678761ms
addons_test.go:334: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:345: "registry-6fb4cdfc84-hbr57" [d2171259-b754-493c-a539-6115a91bf784] Running
addons_test.go:334: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.005593656s
addons_test.go:337: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:345: "registry-proxy-j5x8q" [97be67b8-2585-454b-bdbb-6a388e9592e6] Running
addons_test.go:337: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.004692421s
addons_test.go:342: (dbg) Run:  kubectl --context addons-540000 delete po -l run=registry-test --now
addons_test.go:347: (dbg) Run:  kubectl --context addons-540000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:347: (dbg) Non-zero exit: kubectl --context addons-540000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": exit status 1 (1m0.060846668s)

-- stdout --
	pod "registry-test" deleted

-- /stdout --
** stderr ** 
	error: timed out waiting for the condition

** /stderr **
addons_test.go:349: failed to hit registry.kube-system.svc.cluster.local. args "kubectl --context addons-540000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c \"wget --spider -S http://registry.kube-system.svc.cluster.local\"" failed: exit status 1
addons_test.go:353: expected curl response be "HTTP/1.1 200", but got *pod "registry-test" deleted
*
addons_test.go:361: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 ip
2024/08/31 15:19:09 [DEBUG] GET http://192.169.0.2:5000
addons_test.go:390: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable registry --alsologtostderr -v=1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p addons-540000 -n addons-540000
helpers_test.go:245: <<< TestAddons/parallel/Registry FAILED: start of post-mortem logs <<<
helpers_test.go:246: ======>  post-mortem[TestAddons/parallel/Registry]: minikube logs <======
helpers_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 logs -n 25
helpers_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p addons-540000 logs -n 25: (2.674572805s)
helpers_test.go:253: TestAddons/parallel/Registry logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only              | download-only-798000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT |                     |
	|         | -p download-only-798000              |                      |         |         |                     |                     |
	|         | --force --alsologtostderr            |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.20.0         |                      |         |         |                     |                     |
	|         | --container-runtime=docker           |                      |         |         |                     |                     |
	|         | --driver=hyperkit                    |                      |         |         |                     |                     |
	| delete  | --all                                | minikube             | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT | 31 Aug 24 15:05 PDT |
	| delete  | -p download-only-798000              | download-only-798000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT | 31 Aug 24 15:05 PDT |
	| start   | -o=json --download-only              | download-only-982000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT |                     |
	|         | -p download-only-982000              |                      |         |         |                     |                     |
	|         | --force --alsologtostderr            |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.31.0         |                      |         |         |                     |                     |
	|         | --container-runtime=docker           |                      |         |         |                     |                     |
	|         | --driver=hyperkit                    |                      |         |         |                     |                     |
	| delete  | --all                                | minikube             | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT | 31 Aug 24 15:05 PDT |
	| delete  | -p download-only-982000              | download-only-982000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT | 31 Aug 24 15:05 PDT |
	| delete  | -p download-only-798000              | download-only-798000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT | 31 Aug 24 15:05 PDT |
	| delete  | -p download-only-982000              | download-only-982000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT | 31 Aug 24 15:05 PDT |
	| start   | --download-only -p                   | binary-mirror-866000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT |                     |
	|         | binary-mirror-866000                 |                      |         |         |                     |                     |
	|         | --alsologtostderr                    |                      |         |         |                     |                     |
	|         | --binary-mirror                      |                      |         |         |                     |                     |
	|         | http://127.0.0.1:49637               |                      |         |         |                     |                     |
	|         | --driver=hyperkit                    |                      |         |         |                     |                     |
	| delete  | -p binary-mirror-866000              | binary-mirror-866000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT | 31 Aug 24 15:05 PDT |
	| addons  | enable dashboard -p                  | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT |                     |
	|         | addons-540000                        |                      |         |         |                     |                     |
	| addons  | disable dashboard -p                 | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT |                     |
	|         | addons-540000                        |                      |         |         |                     |                     |
	| start   | -p addons-540000 --wait=true         | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT | 31 Aug 24 15:09 PDT |
	|         | --memory=4000 --alsologtostderr      |                      |         |         |                     |                     |
	|         | --addons=registry                    |                      |         |         |                     |                     |
	|         | --addons=metrics-server              |                      |         |         |                     |                     |
	|         | --addons=volumesnapshots             |                      |         |         |                     |                     |
	|         | --addons=csi-hostpath-driver         |                      |         |         |                     |                     |
	|         | --addons=gcp-auth                    |                      |         |         |                     |                     |
	|         | --addons=cloud-spanner               |                      |         |         |                     |                     |
	|         | --addons=inspektor-gadget            |                      |         |         |                     |                     |
	|         | --addons=storage-provisioner-rancher |                      |         |         |                     |                     |
	|         | --addons=nvidia-device-plugin        |                      |         |         |                     |                     |
	|         | --addons=yakd --addons=volcano       |                      |         |         |                     |                     |
	|         | --driver=hyperkit  --addons=ingress  |                      |         |         |                     |                     |
	|         | --addons=ingress-dns                 |                      |         |         |                     |                     |
	|         | --addons=helm-tiller                 |                      |         |         |                     |                     |
	| addons  | addons-540000 addons disable         | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:09 PDT | 31 Aug 24 15:09 PDT |
	|         | volcano --alsologtostderr -v=1       |                      |         |         |                     |                     |
	| addons  | addons-540000 addons                 | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:18 PDT | 31 Aug 24 15:18 PDT |
	|         | disable csi-hostpath-driver          |                      |         |         |                     |                     |
	|         | --alsologtostderr -v=1               |                      |         |         |                     |                     |
	| addons  | addons-540000 addons                 | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:18 PDT | 31 Aug 24 15:18 PDT |
	|         | disable volumesnapshots              |                      |         |         |                     |                     |
	|         | --alsologtostderr -v=1               |                      |         |         |                     |                     |
	| addons  | addons-540000 addons disable         | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:18 PDT | 31 Aug 24 15:18 PDT |
	|         | helm-tiller --alsologtostderr        |                      |         |         |                     |                     |
	|         | -v=1                                 |                      |         |         |                     |                     |
	| addons  | addons-540000 addons                 | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:18 PDT | 31 Aug 24 15:18 PDT |
	|         | disable metrics-server               |                      |         |         |                     |                     |
	|         | --alsologtostderr -v=1               |                      |         |         |                     |                     |
	| addons  | disable inspektor-gadget -p          | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:19 PDT | 31 Aug 24 15:19 PDT |
	|         | addons-540000                        |                      |         |         |                     |                     |
	| ip      | addons-540000 ip                     | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:19 PDT | 31 Aug 24 15:19 PDT |
	| addons  | addons-540000 addons disable         | addons-540000        | jenkins | v1.33.1 | 31 Aug 24 15:19 PDT | 31 Aug 24 15:19 PDT |
	|         | registry --alsologtostderr           |                      |         |         |                     |                     |
	|         | -v=1                                 |                      |         |         |                     |                     |
	|---------|--------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:05:27
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:05:27.663790    1563 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:05:27.663971    1563 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:05:27.663977    1563 out.go:358] Setting ErrFile to fd 2...
	I0831 15:05:27.663981    1563 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:05:27.664158    1563 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:05:27.665595    1563 out.go:352] Setting JSON to false
	I0831 15:05:27.687535    1563 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":298,"bootTime":1725141629,"procs":412,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:05:27.687622    1563 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:05:27.708785    1563 out.go:177] * [addons-540000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:05:27.751709    1563 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:05:27.751782    1563 notify.go:220] Checking for updates...
	I0831 15:05:27.793317    1563 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:05:27.814808    1563 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:05:27.835728    1563 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:05:27.856755    1563 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:05:27.877585    1563 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:05:27.899014    1563 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:05:27.930591    1563 out.go:177] * Using the hyperkit driver based on user configuration
	I0831 15:05:27.972585    1563 start.go:297] selected driver: hyperkit
	I0831 15:05:27.972615    1563 start.go:901] validating driver "hyperkit" against <nil>
	I0831 15:05:27.972638    1563 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:05:27.976930    1563 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:05:27.977046    1563 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:05:27.985276    1563 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:05:27.989074    1563 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:05:27.989095    1563 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:05:27.989124    1563 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 15:05:27.989312    1563 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:05:27.989345    1563 cni.go:84] Creating CNI manager for ""
	I0831 15:05:27.989359    1563 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0831 15:05:27.989364    1563 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0831 15:05:27.989425    1563 start.go:340] cluster config:
	{Name:addons-540000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:4000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:addons-540000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:05:27.989513    1563 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:05:28.031751    1563 out.go:177] * Starting "addons-540000" primary control-plane node in "addons-540000" cluster
	I0831 15:05:28.052510    1563 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:05:28.052580    1563 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:05:28.052607    1563 cache.go:56] Caching tarball of preloaded images
	I0831 15:05:28.052822    1563 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:05:28.052843    1563 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:05:28.053313    1563 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/config.json ...
	I0831 15:05:28.053355    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/config.json: {Name:mk3bf14dee5fbcd3e3563606958c277b49ae604f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:28.054027    1563 start.go:360] acquireMachinesLock for addons-540000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:05:28.054253    1563 start.go:364] duration metric: took 206.411µs to acquireMachinesLock for "addons-540000"
	I0831 15:05:28.054295    1563 start.go:93] Provisioning new machine with config: &{Name:addons-540000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:4000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:addons-540000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:05:28.054399    1563 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 15:05:28.075884    1563 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=4000MB, Disk=20000MB) ...
	I0831 15:05:28.076112    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:05:28.076161    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:05:28.085063    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49644
	I0831 15:05:28.085409    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:05:28.085816    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:05:28.085833    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:05:28.086035    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:05:28.086160    1563 main.go:141] libmachine: (addons-540000) Calling .GetMachineName
	I0831 15:05:28.086251    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:05:28.086362    1563 start.go:159] libmachine.API.Create for "addons-540000" (driver="hyperkit")
	I0831 15:05:28.086384    1563 client.go:168] LocalClient.Create starting
	I0831 15:05:28.086426    1563 main.go:141] libmachine: Creating CA: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:05:28.297223    1563 main.go:141] libmachine: Creating client certificate: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:05:28.425715    1563 main.go:141] libmachine: Running pre-create checks...
	I0831 15:05:28.425727    1563 main.go:141] libmachine: (addons-540000) Calling .PreCreateCheck
	I0831 15:05:28.425853    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:05:28.426041    1563 main.go:141] libmachine: (addons-540000) Calling .GetConfigRaw
	I0831 15:05:28.426456    1563 main.go:141] libmachine: Creating machine...
	I0831 15:05:28.426470    1563 main.go:141] libmachine: (addons-540000) Calling .Create
	I0831 15:05:28.426561    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:05:28.426683    1563 main.go:141] libmachine: (addons-540000) DBG | I0831 15:05:28.426553    1571 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:05:28.426740    1563 main.go:141] libmachine: (addons-540000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:05:28.649868    1563 main.go:141] libmachine: (addons-540000) DBG | I0831 15:05:28.649778    1571 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa...
	I0831 15:05:28.704999    1563 main.go:141] libmachine: (addons-540000) DBG | I0831 15:05:28.704921    1571 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/addons-540000.rawdisk...
	I0831 15:05:28.705011    1563 main.go:141] libmachine: (addons-540000) DBG | Writing magic tar header
	I0831 15:05:28.705018    1563 main.go:141] libmachine: (addons-540000) DBG | Writing SSH key tar header
	I0831 15:05:28.705916    1563 main.go:141] libmachine: (addons-540000) DBG | I0831 15:05:28.705821    1571 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000 ...
	I0831 15:05:29.069285    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:05:29.069302    1563 main.go:141] libmachine: (addons-540000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/hyperkit.pid
	I0831 15:05:29.069312    1563 main.go:141] libmachine: (addons-540000) DBG | Using UUID 5cb961c6-b13a-45fd-9f1a-7b22b1f4e295
	I0831 15:05:29.336846    1563 main.go:141] libmachine: (addons-540000) DBG | Generated MAC 3a:82:3b:14:54:13
	I0831 15:05:29.336867    1563 main.go:141] libmachine: (addons-540000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=addons-540000
	I0831 15:05:29.336906    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5cb961c6-b13a-45fd-9f1a-7b22b1f4e295", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/initrd", Bootrom:"", CPUs:2, Memory:4000, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:05:29.336941    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5cb961c6-b13a-45fd-9f1a-7b22b1f4e295", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/initrd", Bootrom:"", CPUs:2, Memory:4000, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:05:29.336988    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/hyperkit.pid", "-c", "2", "-m", "4000M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "5cb961c6-b13a-45fd-9f1a-7b22b1f4e295", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/addons-540000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addon
s-540000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=addons-540000"}
	I0831 15:05:29.337021    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/hyperkit.pid -c 2 -m 4000M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 5cb961c6-b13a-45fd-9f1a-7b22b1f4e295 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/addons-540000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 co
nsole=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=addons-540000"
	I0831 15:05:29.337033    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:05:29.339839    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 DEBUG: hyperkit: Pid is 1576
	I0831 15:05:29.340227    1563 main.go:141] libmachine: (addons-540000) DBG | Attempt 0
	I0831 15:05:29.340240    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:05:29.340318    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:05:29.341162    1563 main.go:141] libmachine: (addons-540000) DBG | Searching for 3a:82:3b:14:54:13 in /var/db/dhcpd_leases ...
	I0831 15:05:29.357365    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:05:29.415997    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:05:29.416667    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:05:29.416689    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:05:29.416705    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:05:29.416722    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:05:29.939966    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:05:29.939985    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:05:30.055770    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:30 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:05:30.055792    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:30 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:05:30.055840    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:30 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:05:30.055857    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:30 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:05:30.056680    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:30 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:05:30.056692    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:30 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:05:31.342155    1563 main.go:141] libmachine: (addons-540000) DBG | Attempt 1
	I0831 15:05:31.342178    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:05:31.342315    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:05:31.343249    1563 main.go:141] libmachine: (addons-540000) DBG | Searching for 3a:82:3b:14:54:13 in /var/db/dhcpd_leases ...
	I0831 15:05:33.344180    1563 main.go:141] libmachine: (addons-540000) DBG | Attempt 2
	I0831 15:05:33.344196    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:05:33.344270    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:05:33.344992    1563 main.go:141] libmachine: (addons-540000) DBG | Searching for 3a:82:3b:14:54:13 in /var/db/dhcpd_leases ...
	I0831 15:05:35.345251    1563 main.go:141] libmachine: (addons-540000) DBG | Attempt 3
	I0831 15:05:35.345270    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:05:35.345353    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:05:35.346099    1563 main.go:141] libmachine: (addons-540000) DBG | Searching for 3a:82:3b:14:54:13 in /var/db/dhcpd_leases ...
	I0831 15:05:35.629056    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:35 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:05:35.629131    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:35 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:05:35.629140    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:35 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:05:35.647737    1563 main.go:141] libmachine: (addons-540000) DBG | 2024/08/31 15:05:35 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:05:37.346732    1563 main.go:141] libmachine: (addons-540000) DBG | Attempt 4
	I0831 15:05:37.346748    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:05:37.346845    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:05:37.347559    1563 main.go:141] libmachine: (addons-540000) DBG | Searching for 3a:82:3b:14:54:13 in /var/db/dhcpd_leases ...
	I0831 15:05:39.348366    1563 main.go:141] libmachine: (addons-540000) DBG | Attempt 5
	I0831 15:05:39.348392    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:05:39.348595    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:05:39.349944    1563 main.go:141] libmachine: (addons-540000) DBG | Searching for 3a:82:3b:14:54:13 in /var/db/dhcpd_leases ...
	I0831 15:05:39.350063    1563 main.go:141] libmachine: (addons-540000) DBG | Found 1 entries in /var/db/dhcpd_leases!
	I0831 15:05:39.350087    1563 main.go:141] libmachine: (addons-540000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:05:39.350102    1563 main.go:141] libmachine: (addons-540000) DBG | Found match: 3a:82:3b:14:54:13
	I0831 15:05:39.350113    1563 main.go:141] libmachine: (addons-540000) DBG | IP: 192.169.0.2
	I0831 15:05:39.350183    1563 main.go:141] libmachine: (addons-540000) Calling .GetConfigRaw
	I0831 15:05:39.351163    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:05:39.351337    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:05:39.351458    1563 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:05:39.351476    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:05:39.351580    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:05:39.351659    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:05:39.352613    1563 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:05:39.352628    1563 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:05:39.352637    1563 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:05:39.352643    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:39.352775    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:39.352891    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:39.353028    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:39.353149    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:39.353886    1563 main.go:141] libmachine: Using SSH client type: native
	I0831 15:05:39.354043    1563 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x113e4ea0] 0x113e7c00 <nil>  [] 0s} 192.169.0.2 22 <nil> <nil>}
	I0831 15:05:39.354051    1563 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:05:40.374010    1563 main.go:141] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I0831 15:05:43.436981    1563 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:05:43.436993    1563 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:05:43.436999    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:43.437137    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:43.437250    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:43.437346    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:43.437442    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:43.437568    1563 main.go:141] libmachine: Using SSH client type: native
	I0831 15:05:43.437713    1563 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x113e4ea0] 0x113e7c00 <nil>  [] 0s} 192.169.0.2 22 <nil> <nil>}
	I0831 15:05:43.437720    1563 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:05:43.497664    1563 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:05:43.497721    1563 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:05:43.497728    1563 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:05:43.497734    1563 main.go:141] libmachine: (addons-540000) Calling .GetMachineName
	I0831 15:05:43.497865    1563 buildroot.go:166] provisioning hostname "addons-540000"
	I0831 15:05:43.497873    1563 main.go:141] libmachine: (addons-540000) Calling .GetMachineName
	I0831 15:05:43.497966    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:43.498055    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:43.498139    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:43.498224    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:43.498336    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:43.498482    1563 main.go:141] libmachine: Using SSH client type: native
	I0831 15:05:43.498625    1563 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x113e4ea0] 0x113e7c00 <nil>  [] 0s} 192.169.0.2 22 <nil> <nil>}
	I0831 15:05:43.498633    1563 main.go:141] libmachine: About to run SSH command:
	sudo hostname addons-540000 && echo "addons-540000" | sudo tee /etc/hostname
	I0831 15:05:43.570045    1563 main.go:141] libmachine: SSH cmd err, output: <nil>: addons-540000
	
	I0831 15:05:43.570064    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:43.570201    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:43.570315    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:43.570416    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:43.570495    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:43.570611    1563 main.go:141] libmachine: Using SSH client type: native
	I0831 15:05:43.570748    1563 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x113e4ea0] 0x113e7c00 <nil>  [] 0s} 192.169.0.2 22 <nil> <nil>}
	I0831 15:05:43.570759    1563 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\saddons-540000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 addons-540000/g' /etc/hosts;
				else 
					echo '127.0.1.1 addons-540000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:05:43.638137    1563 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:05:43.638156    1563 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:05:43.638167    1563 buildroot.go:174] setting up certificates
	I0831 15:05:43.638182    1563 provision.go:84] configureAuth start
	I0831 15:05:43.638189    1563 main.go:141] libmachine: (addons-540000) Calling .GetMachineName
	I0831 15:05:43.638334    1563 main.go:141] libmachine: (addons-540000) Calling .GetIP
	I0831 15:05:43.638433    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:43.638521    1563 provision.go:143] copyHostCerts
	I0831 15:05:43.638630    1563 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:05:43.638931    1563 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:05:43.639130    1563 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:05:43.639313    1563 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.addons-540000 san=[127.0.0.1 192.169.0.2 addons-540000 localhost minikube]
	I0831 15:05:43.895724    1563 provision.go:177] copyRemoteCerts
	I0831 15:05:43.895793    1563 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:05:43.895810    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:43.895970    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:43.896073    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:43.896179    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:43.896282    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:05:43.933012    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:05:43.957656    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:05:43.976804    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:05:43.995952    1563 provision.go:87] duration metric: took 357.750221ms to configureAuth
	I0831 15:05:43.995972    1563 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:05:43.996129    1563 config.go:182] Loaded profile config "addons-540000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:05:43.996148    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:05:43.996288    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:43.996380    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:43.996460    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:43.996544    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:43.996621    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:43.996726    1563 main.go:141] libmachine: Using SSH client type: native
	I0831 15:05:43.996848    1563 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x113e4ea0] 0x113e7c00 <nil>  [] 0s} 192.169.0.2 22 <nil> <nil>}
	I0831 15:05:43.996855    1563 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:05:44.056187    1563 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:05:44.056201    1563 buildroot.go:70] root file system type: tmpfs
	I0831 15:05:44.056288    1563 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:05:44.056301    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:44.056446    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:44.056547    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:44.056629    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:44.056716    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:44.056877    1563 main.go:141] libmachine: Using SSH client type: native
	I0831 15:05:44.057006    1563 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x113e4ea0] 0x113e7c00 <nil>  [] 0s} 192.169.0.2 22 <nil> <nil>}
	I0831 15:05:44.057051    1563 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:05:44.128017    1563 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:05:44.128041    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:44.128182    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:44.128291    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:44.128436    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:44.128531    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:44.128661    1563 main.go:141] libmachine: Using SSH client type: native
	I0831 15:05:44.128795    1563 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x113e4ea0] 0x113e7c00 <nil>  [] 0s} 192.169.0.2 22 <nil> <nil>}
	I0831 15:05:44.128807    1563 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:05:45.677177    1563 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:05:45.677194    1563 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:05:45.677200    1563 main.go:141] libmachine: (addons-540000) Calling .GetURL
	I0831 15:05:45.677342    1563 main.go:141] libmachine: Docker is up and running!
	I0831 15:05:45.677351    1563 main.go:141] libmachine: Reticulating splines...
	I0831 15:05:45.677355    1563 client.go:171] duration metric: took 17.590733624s to LocalClient.Create
	I0831 15:05:45.677367    1563 start.go:167] duration metric: took 17.590775911s to libmachine.API.Create "addons-540000"
	I0831 15:05:45.677378    1563 start.go:293] postStartSetup for "addons-540000" (driver="hyperkit")
	I0831 15:05:45.677385    1563 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:05:45.677395    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:05:45.677541    1563 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:05:45.677556    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:45.677644    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:45.677732    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:45.677833    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:45.677938    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:05:45.712973    1563 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:05:45.716092    1563 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:05:45.716108    1563 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:05:45.716209    1563 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:05:45.716258    1563 start.go:296] duration metric: took 38.874291ms for postStartSetup
	I0831 15:05:45.716284    1563 main.go:141] libmachine: (addons-540000) Calling .GetConfigRaw
	I0831 15:05:45.716874    1563 main.go:141] libmachine: (addons-540000) Calling .GetIP
	I0831 15:05:45.717009    1563 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/config.json ...
	I0831 15:05:45.717353    1563 start.go:128] duration metric: took 17.66271056s to createHost
	I0831 15:05:45.717370    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:45.717462    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:45.717537    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:45.717619    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:45.717701    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:45.717812    1563 main.go:141] libmachine: Using SSH client type: native
	I0831 15:05:45.717939    1563 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x113e4ea0] 0x113e7c00 <nil>  [] 0s} 192.169.0.2 22 <nil> <nil>}
	I0831 15:05:45.717947    1563 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:05:45.776562    1563 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725141945.835489445
	
	I0831 15:05:45.776573    1563 fix.go:216] guest clock: 1725141945.835489445
	I0831 15:05:45.776578    1563 fix.go:229] Guest: 2024-08-31 15:05:45.835489445 -0700 PDT Remote: 2024-08-31 15:05:45.717361 -0700 PDT m=+18.088159558 (delta=118.128445ms)
	I0831 15:05:45.776596    1563 fix.go:200] guest clock delta is within tolerance: 118.128445ms
	I0831 15:05:45.776600    1563 start.go:83] releasing machines lock for "addons-540000", held for 17.722101719s
	I0831 15:05:45.776620    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:05:45.776748    1563 main.go:141] libmachine: (addons-540000) Calling .GetIP
	I0831 15:05:45.776852    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:05:45.777143    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:05:45.777245    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:05:45.777319    1563 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:05:45.777352    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:45.777389    1563 ssh_runner.go:195] Run: cat /version.json
	I0831 15:05:45.777401    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:05:45.777443    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:45.777501    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:05:45.777529    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:45.777592    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:45.777605    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:05:45.777684    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:05:45.777712    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:05:45.777776    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:05:45.809993    1563 ssh_runner.go:195] Run: systemctl --version
	I0831 15:05:45.860453    1563 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:05:45.865663    1563 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:05:45.865701    1563 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:05:45.880420    1563 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:05:45.880435    1563 start.go:495] detecting cgroup driver to use...
	I0831 15:05:45.880538    1563 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:05:45.895513    1563 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:05:45.904624    1563 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:05:45.913718    1563 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:05:45.913764    1563 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:05:45.922704    1563 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:05:45.931656    1563 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:05:45.940891    1563 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:05:45.953278    1563 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:05:45.970021    1563 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:05:45.983536    1563 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:05:45.997402    1563 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:05:46.008291    1563 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:05:46.016597    1563 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:05:46.024550    1563 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:05:46.120557    1563 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:05:46.139286    1563 start.go:495] detecting cgroup driver to use...
	I0831 15:05:46.139367    1563 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:05:46.155029    1563 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:05:46.167434    1563 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:05:46.180105    1563 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:05:46.190528    1563 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:05:46.200729    1563 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:05:46.224180    1563 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:05:46.235681    1563 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:05:46.250678    1563 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:05:46.253780    1563 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:05:46.261068    1563 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:05:46.274374    1563 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:05:46.373405    1563 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:05:46.489876    1563 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:05:46.489940    1563 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:05:46.504594    1563 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:05:46.596989    1563 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:05:48.887260    1563 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.290219409s)
	I0831 15:05:48.887322    1563 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:05:48.899538    1563 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:05:48.912126    1563 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:05:48.922192    1563 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:05:49.022769    1563 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:05:49.126715    1563 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:05:49.237354    1563 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:05:49.252035    1563 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:05:49.262088    1563 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:05:49.357665    1563 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:05:49.418611    1563 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:05:49.418697    1563 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:05:49.423295    1563 start.go:563] Will wait 60s for crictl version
	I0831 15:05:49.423348    1563 ssh_runner.go:195] Run: which crictl
	I0831 15:05:49.426567    1563 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:05:49.456432    1563 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:05:49.456506    1563 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:05:49.472916    1563 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:05:49.515810    1563 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:05:49.515863    1563 main.go:141] libmachine: (addons-540000) Calling .GetIP
	I0831 15:05:49.516243    1563 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:05:49.520820    1563 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:05:49.531758    1563 kubeadm.go:883] updating cluster {Name:addons-540000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:4000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1
.31.0 ClusterName:addons-540000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.2 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountTyp
e:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:05:49.531815    1563 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:05:49.531875    1563 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:05:49.543572    1563 docker.go:685] Got preloaded images: 
	I0831 15:05:49.543587    1563 docker.go:691] registry.k8s.io/kube-apiserver:v1.31.0 wasn't preloaded
	I0831 15:05:49.543638    1563 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0831 15:05:49.552192    1563 ssh_runner.go:195] Run: which lz4
	I0831 15:05:49.555085    1563 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0831 15:05:49.558147    1563 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0831 15:05:49.558162    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (342554258 bytes)
	I0831 15:05:50.381585    1563 docker.go:649] duration metric: took 826.528412ms to copy over tarball
	I0831 15:05:50.381648    1563 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0831 15:05:52.794744    1563 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.413046978s)
	I0831 15:05:52.794760    1563 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0831 15:05:52.819892    1563 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0831 15:05:52.827786    1563 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2631 bytes)
	I0831 15:05:52.842099    1563 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:05:52.940970    1563 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:05:55.503906    1563 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.562882989s)
	I0831 15:05:55.503994    1563 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:05:55.526184    1563 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0831 15:05:55.526206    1563 cache_images.go:84] Images are preloaded, skipping loading
	I0831 15:05:55.526220    1563 kubeadm.go:934] updating node { 192.169.0.2 8443 v1.31.0 docker true true} ...
	I0831 15:05:55.526316    1563 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=addons-540000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:addons-540000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:05:55.526378    1563 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:05:55.590244    1563 cni.go:84] Creating CNI manager for ""
	I0831 15:05:55.590261    1563 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0831 15:05:55.590280    1563 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:05:55.590293    1563 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.2 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:addons-540000 NodeName:addons-540000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernet
es/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:05:55.590368    1563 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "addons-540000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.2
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.2"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 15:05:55.590428    1563 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:05:55.603193    1563 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:05:55.603247    1563 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0831 15:05:55.615817    1563 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:05:55.637405    1563 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:05:55.659820    1563 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2152 bytes)
	I0831 15:05:55.675069    1563 ssh_runner.go:195] Run: grep 192.169.0.2	control-plane.minikube.internal$ /etc/hosts
	I0831 15:05:55.679046    1563 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:05:55.691090    1563 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:05:55.814695    1563 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:05:55.833454    1563 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000 for IP: 192.169.0.2
	I0831 15:05:55.833468    1563 certs.go:194] generating shared ca certs ...
	I0831 15:05:55.833480    1563 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:55.833637    1563 certs.go:240] generating "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:05:56.019730    1563 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt ...
	I0831 15:05:56.019746    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt: {Name:mkd89666e63047a0a9dff64d7468bcc0b84c03cb Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:56.020062    1563 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key ...
	I0831 15:05:56.020071    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key: {Name:mk46adc30c7c10c441ef421d0632e0e0f536def4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:56.020323    1563 certs.go:240] generating "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:05:56.130015    1563 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt ...
	I0831 15:05:56.130032    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt: {Name:mk3507d070c05d5358f4c785d1fb18b44935deb8 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:56.130353    1563 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key ...
	I0831 15:05:56.130361    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key: {Name:mk1ad9264765cc72a7cadbd71d2d0c61f22df06b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:56.130580    1563 certs.go:256] generating profile certs ...
	I0831 15:05:56.130630    1563 certs.go:363] generating signed profile cert for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.key
	I0831 15:05:56.130648    1563 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt with IP's: []
	I0831 15:05:56.252237    1563 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt ...
	I0831 15:05:56.252322    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: {Name:mk8052de448831cc7af5b9f3fe93afb2170e13fe Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:56.252868    1563 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.key ...
	I0831 15:05:56.252883    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.key: {Name:mkfc55bf420dd6c0cad55ca873973320d4ae8990 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:56.253125    1563 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.key.eca08ac2
	I0831 15:05:56.253152    1563 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.crt.eca08ac2 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.2]
	I0831 15:05:56.410169    1563 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.crt.eca08ac2 ...
	I0831 15:05:56.410185    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.crt.eca08ac2: {Name:mk1f92778ecca3854d69ac4933dd200c831cbf67 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:56.410464    1563 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.key.eca08ac2 ...
	I0831 15:05:56.410473    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.key.eca08ac2: {Name:mkf922e43baf638cdb5e8f292f54a6d152b4af6a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:56.410681    1563 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.crt.eca08ac2 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.crt
	I0831 15:05:56.411101    1563 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.key.eca08ac2 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.key
	I0831 15:05:56.411276    1563 certs.go:363] generating signed profile cert for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/proxy-client.key
	I0831 15:05:56.411296    1563 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/proxy-client.crt with IP's: []
	I0831 15:05:56.531708    1563 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/proxy-client.crt ...
	I0831 15:05:56.531724    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/proxy-client.crt: {Name:mk42ec3fbd45be5572f15a28b564e65c7c0e8760 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:56.532018    1563 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/proxy-client.key ...
	I0831 15:05:56.532026    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/proxy-client.key: {Name:mked06a17557dad0e64552377f9e2063bcab79c5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:05:56.532483    1563 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:05:56.532538    1563 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:05:56.532585    1563 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:05:56.532630    1563 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:05:56.533147    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:05:56.571663    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:05:56.606183    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:05:56.630399    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:05:56.656716    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1419 bytes)
	I0831 15:05:56.683129    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:05:56.709343    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:05:56.733932    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0831 15:05:56.760837    1563 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:05:56.782978    1563 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:05:56.796991    1563 ssh_runner.go:195] Run: openssl version
	I0831 15:05:56.801917    1563 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:05:56.811668    1563 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:05:56.815643    1563 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:05:56.815680    1563 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:05:56.820605    1563 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:05:56.831490    1563 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:05:56.835647    1563 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:05:56.835687    1563 kubeadm.go:392] StartCluster: {Name:addons-540000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:4000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31
.0 ClusterName:addons-540000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.2 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9
p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:05:56.835777    1563 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:05:56.851832    1563 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:05:56.862372    1563 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0831 15:05:56.872550    1563 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0831 15:05:56.882998    1563 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0831 15:05:56.883009    1563 kubeadm.go:157] found existing configuration files:
	
	I0831 15:05:56.883045    1563 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0831 15:05:56.892988    1563 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0831 15:05:56.893032    1563 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0831 15:05:56.903021    1563 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0831 15:05:56.913310    1563 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0831 15:05:56.913355    1563 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0831 15:05:56.923808    1563 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0831 15:05:56.932057    1563 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0831 15:05:56.932106    1563 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0831 15:05:56.942477    1563 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0831 15:05:56.952148    1563 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0831 15:05:56.952187    1563 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0831 15:05:56.961603    1563 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0831 15:05:57.000176    1563 kubeadm.go:310] [init] Using Kubernetes version: v1.31.0
	I0831 15:05:57.000285    1563 kubeadm.go:310] [preflight] Running pre-flight checks
	I0831 15:05:57.094608    1563 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0831 15:05:57.094702    1563 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0831 15:05:57.094787    1563 kubeadm.go:310] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I0831 15:05:57.106659    1563 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0831 15:05:57.169608    1563 out.go:235]   - Generating certificates and keys ...
	I0831 15:05:57.169668    1563 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0831 15:05:57.169728    1563 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0831 15:05:57.389490    1563 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0831 15:05:57.707738    1563 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0831 15:05:57.816599    1563 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0831 15:05:58.012033    1563 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0831 15:05:58.409983    1563 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0831 15:05:58.410089    1563 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [addons-540000 localhost] and IPs [192.169.0.2 127.0.0.1 ::1]
	I0831 15:05:58.719776    1563 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0831 15:05:58.719897    1563 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [addons-540000 localhost] and IPs [192.169.0.2 127.0.0.1 ::1]
	I0831 15:05:58.786733    1563 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0831 15:05:59.086451    1563 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0831 15:05:59.245339    1563 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0831 15:05:59.245530    1563 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0831 15:05:59.364917    1563 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0831 15:05:59.785802    1563 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0831 15:06:00.063321    1563 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0831 15:06:00.400384    1563 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0831 15:06:00.485278    1563 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0831 15:06:00.485984    1563 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0831 15:06:00.489117    1563 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0831 15:06:00.533887    1563 out.go:235]   - Booting up control plane ...
	I0831 15:06:00.534000    1563 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0831 15:06:00.534082    1563 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0831 15:06:00.534138    1563 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0831 15:06:00.534245    1563 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0831 15:06:00.534332    1563 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0831 15:06:00.534371    1563 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0831 15:06:00.636392    1563 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0831 15:06:00.636495    1563 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I0831 15:06:01.137324    1563 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.366059ms
	I0831 15:06:01.137397    1563 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0831 15:06:05.637504    1563 kubeadm.go:310] [api-check] The API server is healthy after 4.502768223s
	I0831 15:06:05.652281    1563 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0831 15:06:05.660587    1563 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0831 15:06:05.679939    1563 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0831 15:06:05.680110    1563 kubeadm.go:310] [mark-control-plane] Marking the node addons-540000 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0831 15:06:05.688309    1563 kubeadm.go:310] [bootstrap-token] Using token: 6aoud8.pz1tgzbomwzcn05o
	I0831 15:06:05.726941    1563 out.go:235]   - Configuring RBAC rules ...
	I0831 15:06:05.727146    1563 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0831 15:06:05.760854    1563 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0831 15:06:05.767862    1563 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0831 15:06:05.771411    1563 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0831 15:06:05.774340    1563 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0831 15:06:05.777072    1563 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0831 15:06:06.046756    1563 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0831 15:06:06.459140    1563 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0831 15:06:07.045002    1563 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0831 15:06:07.045552    1563 kubeadm.go:310] 
	I0831 15:06:07.045610    1563 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0831 15:06:07.045617    1563 kubeadm.go:310] 
	I0831 15:06:07.045696    1563 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0831 15:06:07.045706    1563 kubeadm.go:310] 
	I0831 15:06:07.045733    1563 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0831 15:06:07.045790    1563 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0831 15:06:07.045836    1563 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0831 15:06:07.045846    1563 kubeadm.go:310] 
	I0831 15:06:07.045897    1563 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0831 15:06:07.045906    1563 kubeadm.go:310] 
	I0831 15:06:07.045944    1563 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0831 15:06:07.045948    1563 kubeadm.go:310] 
	I0831 15:06:07.046002    1563 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0831 15:06:07.046067    1563 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0831 15:06:07.046122    1563 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0831 15:06:07.046128    1563 kubeadm.go:310] 
	I0831 15:06:07.046197    1563 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0831 15:06:07.046262    1563 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0831 15:06:07.046269    1563 kubeadm.go:310] 
	I0831 15:06:07.046334    1563 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token 6aoud8.pz1tgzbomwzcn05o \
	I0831 15:06:07.046417    1563 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 \
	I0831 15:06:07.046436    1563 kubeadm.go:310] 	--control-plane 
	I0831 15:06:07.046442    1563 kubeadm.go:310] 
	I0831 15:06:07.046506    1563 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0831 15:06:07.046512    1563 kubeadm.go:310] 
	I0831 15:06:07.046578    1563 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token 6aoud8.pz1tgzbomwzcn05o \
	I0831 15:06:07.046661    1563 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 
	I0831 15:06:07.047341    1563 kubeadm.go:310] W0831 22:05:57.064037    1534 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 15:06:07.047575    1563 kubeadm.go:310] W0831 22:05:57.064596    1534 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 15:06:07.047676    1563 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0831 15:06:07.047687    1563 cni.go:84] Creating CNI manager for ""
	I0831 15:06:07.047696    1563 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0831 15:06:07.072690    1563 out.go:177] * Configuring bridge CNI (Container Networking Interface) ...
	I0831 15:06:07.115561    1563 ssh_runner.go:195] Run: sudo mkdir -p /etc/cni/net.d
	I0831 15:06:07.125111    1563 ssh_runner.go:362] scp memory --> /etc/cni/net.d/1-k8s.conflist (496 bytes)
	I0831 15:06:07.140699    1563 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0831 15:06:07.140752    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:07.140759    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes addons-540000 minikube.k8s.io/updated_at=2024_08_31T15_06_07_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=addons-540000 minikube.k8s.io/primary=true
	I0831 15:06:07.151731    1563 ops.go:34] apiserver oom_adj: -16
	I0831 15:06:07.246512    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:07.747762    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:08.247386    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:08.746864    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:09.246664    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:09.746788    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:10.246679    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:10.747754    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:11.246851    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:11.746778    1563 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:06:11.837793    1563 kubeadm.go:1113] duration metric: took 4.697022874s to wait for elevateKubeSystemPrivileges
	I0831 15:06:11.837814    1563 kubeadm.go:394] duration metric: took 15.001931326s to StartCluster
	I0831 15:06:11.837831    1563 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:06:11.837994    1563 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:06:11.838311    1563 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:06:11.838592    1563 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0831 15:06:11.838622    1563 start.go:235] Will wait 6m0s for node &{Name: IP:192.169.0.2 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:06:11.838649    1563 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:true csi-hostpath-driver:true dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:true gvisor:false headlamp:false helm-tiller:true inaccel:false ingress:true ingress-dns:true inspektor-gadget:true istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:true nvidia-device-plugin:true nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:true registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:true volcano:true volumesnapshots:true yakd:true]
	I0831 15:06:11.838717    1563 addons.go:69] Setting yakd=true in profile "addons-540000"
	I0831 15:06:11.838723    1563 config.go:182] Loaded profile config "addons-540000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:06:11.838744    1563 addons.go:69] Setting gcp-auth=true in profile "addons-540000"
	I0831 15:06:11.838739    1563 addons.go:69] Setting inspektor-gadget=true in profile "addons-540000"
	I0831 15:06:11.838756    1563 addons.go:69] Setting metrics-server=true in profile "addons-540000"
	I0831 15:06:11.838769    1563 mustload.go:65] Loading cluster: addons-540000
	I0831 15:06:11.838788    1563 addons.go:69] Setting nvidia-device-plugin=true in profile "addons-540000"
	I0831 15:06:11.838789    1563 addons.go:69] Setting cloud-spanner=true in profile "addons-540000"
	I0831 15:06:11.838796    1563 addons.go:69] Setting ingress-dns=true in profile "addons-540000"
	I0831 15:06:11.838803    1563 addons.go:234] Setting addon nvidia-device-plugin=true in "addons-540000"
	I0831 15:06:11.838806    1563 addons.go:69] Setting registry=true in profile "addons-540000"
	I0831 15:06:11.838805    1563 addons.go:69] Setting volumesnapshots=true in profile "addons-540000"
	I0831 15:06:11.838781    1563 addons.go:69] Setting storage-provisioner=true in profile "addons-540000"
	I0831 15:06:11.838827    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.838834    1563 addons.go:234] Setting addon volumesnapshots=true in "addons-540000"
	I0831 15:06:11.838840    1563 addons.go:234] Setting addon storage-provisioner=true in "addons-540000"
	I0831 15:06:11.838772    1563 addons.go:234] Setting addon metrics-server=true in "addons-540000"
	I0831 15:06:11.838864    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.838809    1563 addons.go:234] Setting addon cloud-spanner=true in "addons-540000"
	I0831 15:06:11.838886    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.838776    1563 addons.go:69] Setting storage-provisioner-rancher=true in profile "addons-540000"
	I0831 15:06:11.838902    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.838924    1563 addons_storage_classes.go:33] enableOrDisableStorageClasses storage-provisioner-rancher=true on "addons-540000"
	I0831 15:06:11.838767    1563 addons.go:69] Setting default-storageclass=true in profile "addons-540000"
	I0831 15:06:11.838780    1563 addons.go:69] Setting volcano=true in profile "addons-540000"
	I0831 15:06:11.838969    1563 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "addons-540000"
	I0831 15:06:11.838758    1563 addons.go:69] Setting csi-hostpath-driver=true in profile "addons-540000"
	I0831 15:06:11.839004    1563 addons.go:234] Setting addon volcano=true in "addons-540000"
	I0831 15:06:11.839013    1563 addons.go:234] Setting addon csi-hostpath-driver=true in "addons-540000"
	I0831 15:06:11.839019    1563 config.go:182] Loaded profile config "addons-540000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:06:11.839032    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.839049    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.838793    1563 addons.go:69] Setting ingress=true in profile "addons-540000"
	I0831 15:06:11.839110    1563 addons.go:234] Setting addon ingress=true in "addons-540000"
	I0831 15:06:11.839141    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.839199    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.839207    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.839215    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.839232    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.839238    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.839249    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.839252    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.839253    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.839273    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.839274    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.839347    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.839349    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.839382    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.838800    1563 addons.go:234] Setting addon inspektor-gadget=true in "addons-540000"
	I0831 15:06:11.839393    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.839408    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.838751    1563 addons.go:234] Setting addon yakd=true in "addons-540000"
	I0831 15:06:11.838798    1563 addons.go:69] Setting helm-tiller=true in profile "addons-540000"
	I0831 15:06:11.839472    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.839547    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.839560    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.839566    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.839568    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.839568    1563 addons.go:234] Setting addon helm-tiller=true in "addons-540000"
	I0831 15:06:11.838825    1563 addons.go:234] Setting addon ingress-dns=true in "addons-540000"
	I0831 15:06:11.838863    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.840914    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.840981    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.838825    1563 addons.go:234] Setting addon registry=true in "addons-540000"
	I0831 15:06:11.841139    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.841064    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.841408    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.841627    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.844365    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.844474    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.844711    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.844757    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.844836    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.844816    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.844937    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.844954    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.844998    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.845241    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.848646    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.853933    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49668
	I0831 15:06:11.858070    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.858276    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49670
	I0831 15:06:11.858339    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49671
	I0831 15:06:11.863114    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.863575    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.863732    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49674
	I0831 15:06:11.863811    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.863884    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.863991    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49675
	I0831 15:06:11.867756    1563 out.go:177] * Verifying Kubernetes components...
	I0831 15:06:11.868076    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49677
	I0831 15:06:11.868093    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.868095    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.886656    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.868127    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.868125    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.868190    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49678
	I0831 15:06:11.886718    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.868797    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.874786    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49681
	I0831 15:06:11.874852    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49680
	I0831 15:06:11.877190    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49682
	I0831 15:06:11.878355    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49683
	I0831 15:06:11.880119    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49684
	I0831 15:06:11.887223    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.887224    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.882216    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49685
	I0831 15:06:11.887228    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.882821    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49686
	I0831 15:06:11.887291    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.884711    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49687
	I0831 15:06:11.885300    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49688
	I0831 15:06:11.887388    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.887239    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.887364    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.887440    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.887453    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.887618    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.887640    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.887653    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.887670    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.889130    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.889617    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.889722    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.890019    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.890534    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.890577    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.891260    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.891275    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.891294    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.891309    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.891447    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.891463    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.891476    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.891477    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.891487    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.891494    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.891466    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.891507    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.891523    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.891533    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.891559    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.891467    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.891461    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.891468    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.891577    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.891704    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.891743    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.893303    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.893353    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.893401    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.893604    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.893533    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.893670    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.893717    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.893771    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:11.893823    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.893880    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.893790    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.893976    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:11.894037    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.894122    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.896734    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.898156    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.898197    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.898161    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.898221    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.898264    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.898395    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.898453    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.898553    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.898561    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:11.898586    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:11.898562    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.898593    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:11.899284    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.899563    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.899376    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.899552    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:11.899941    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.900021    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.900096    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.900152    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.900202    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:11.900291    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.900579    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:11.900620    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:11.900744    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.900969    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.901060    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.901078    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.901157    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.901179    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.901252    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.901300    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.905117    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49700
	I0831 15:06:11.905136    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.906149    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49701
	I0831 15:06:11.908420    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.908850    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49703
	I0831 15:06:11.909016    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.909070    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.911649    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.912506    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.913788    1563 addons.go:234] Setting addon storage-provisioner-rancher=true in "addons-540000"
	I0831 15:06:11.913885    1563 addons.go:234] Setting addon default-storageclass=true in "addons-540000"
	I0831 15:06:11.913912    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.914955    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.914966    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:11.915262    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49706
	I0831 15:06:11.915368    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.915703    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49707
	I0831 15:06:11.915735    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.915726    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.918104    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.918494    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.918529    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:11.919591    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.919472    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49709
	I0831 15:06:11.920091    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.921048    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:11.921330    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.921479    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:11.921508    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:11.921596    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:11.921603    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:11.921662    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.921734    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.921872    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.921968    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.921986    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49711
	I0831 15:06:11.921999    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.925092    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.922255    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:11.922393    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.922476    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.923415    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.926907    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49714
	I0831 15:06:11.922691    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.927347    1563 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:06:11.927777    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.927975    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:11.928056    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:11.927830    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:11.928049    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:11.928222    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.928300    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.928456    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.928499    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.928594    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:11.928611    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.928582    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:11.928659    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:11.928799    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:11.928873    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49715
	I0831 15:06:11.928885    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.929130    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:11.929145    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:11.929930    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:11.930272    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:11.930459    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:11.930461    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:11.930536    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:11.930703    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.930872    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.930880    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:11.931079    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:11.968706    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:11.931226    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:11.931338    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:11.936156    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49718
	I0831 15:06:11.936810    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:11.969013    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:11.936854    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49719
	I0831 15:06:11.938370    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49720
	I0831 15:06:11.942458    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49721
	I0831 15:06:11.942663    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49722
	I0831 15:06:11.943950    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49723
	I0831 15:06:11.946001    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49724
	I0831 15:06:11.968529    1563 out.go:177]   - Using image nvcr.io/nvidia/k8s-device-plugin:v0.16.2
	I0831 15:06:11.969152    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:11.969166    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.969318    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.969447    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.969453    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.970294    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:11.980851    1563 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.169.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0831 15:06:11.989340    1563 out.go:177]   - Using image registry.k8s.io/ingress-nginx/controller:v1.11.2
	I0831 15:06:11.989631    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.989736    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.990223    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.990275    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.990290    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.990302    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:11.990306    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:11.990317    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:12.084552    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:12.084575    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:11.990323    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:12.026128    1563 out.go:177]   - Using image registry.k8s.io/sig-storage/snapshot-controller:v6.1.0
	I0831 15:06:12.026879    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.063232    1563 out.go:177]   - Using image registry.k8s.io/metrics-server/metrics-server:v0.7.2
	I0831 15:06:12.063719    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.084169    1563 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0831 15:06:12.084173    1563 out.go:177]   - Using image docker.io/registry:2.8.3
	I0831 15:06:12.084935    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:12.084963    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:12.084974    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.084986    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.085007    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:12.085045    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:12.085074    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:12.110313    1563 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:06:12.121108    1563 addons.go:431] installing /etc/kubernetes/addons/nvidia-device-plugin.yaml
	I0831 15:06:12.121280    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:12.121280    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:12.142077    1563 out.go:177]   - Using image ghcr.io/helm/tiller:v2.17.0
	I0831 15:06:12.142382    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:12.142394    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:12.142398    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:12.142402    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:12.142424    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:12.142576    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:12.142596    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:12.163110    1563 addons.go:431] installing /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml
	I0831 15:06:12.163148    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/nvidia-device-plugin.yaml (1966 bytes)
	I0831 15:06:12.200057    1563 out.go:177]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.4.3
	I0831 15:06:12.200244    1563 ssh_runner.go:362] scp volumesnapshots/csi-hostpath-snapshotclass.yaml --> /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml (934 bytes)
	I0831 15:06:12.221867    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.200280    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.200526    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:12.200570    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:12.200606    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.200608    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:12.200624    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:12.222224    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:12.200716    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.222249    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.222257    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:12.200636    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.200730    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.201681    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:12.200776    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.222354    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.201708    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:12.201703    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:12.201805    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:12.222055    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:12.222098    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:12.221593    1563 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0831 15:06:12.222526    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0831 15:06:12.222549    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.222673    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:12.222940    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.222961    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:12.222973    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.223006    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:12.223037    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:12.295462    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:12.223111    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.223215    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.223286    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:12.223330    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.316753    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:12.223361    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:12.316806    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:12.223555    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:12.224818    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:12.316866    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:12.224862    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:12.258380    1563 addons.go:431] installing /etc/kubernetes/addons/metrics-apiservice.yaml
	I0831 15:06:12.295246    1563 addons.go:431] installing /etc/kubernetes/addons/helm-tiller-dp.yaml
	I0831 15:06:12.390273    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/helm-tiller-dp.yaml (2422 bytes)
	I0831 15:06:12.295755    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.390311    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.316312    1563 out.go:177]   - Using image gcr.io/cloud-spanner-emulator/emulator:1.5.23
	I0831 15:06:12.317038    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.317000    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.327991    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49733
	I0831 15:06:12.328405    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49734
	I0831 15:06:12.353211    1563 out.go:177]   - Using image ghcr.io/inspektor-gadget/inspektor-gadget:v0.31.0
	I0831 15:06:12.353466    1563 ssh_runner.go:362] scp metrics-server/metrics-apiservice.yaml --> /etc/kubernetes/addons/metrics-apiservice.yaml (424 bytes)
	I0831 15:06:12.390085    1563 out.go:177]   - Using image docker.io/volcanosh/vc-webhook-manager:v1.9.0
	I0831 15:06:12.390100    1563 out.go:177]   - Using image registry.k8s.io/sig-storage/hostpathplugin:v1.9.0
	I0831 15:06:12.390530    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.390529    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.411137    1563 out.go:177]   - Using image gcr.io/k8s-minikube/kube-registry-proxy:0.0.6
	I0831 15:06:12.411439    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.411686    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.411696    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.411889    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:12.411924    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:12.432152    1563 out.go:177]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.4.3
	I0831 15:06:12.432616    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.469153    1563 out.go:177]   - Using image docker.io/marcnuri/yakd:0.0.5
	I0831 15:06:12.469281    1563 addons.go:431] installing /etc/kubernetes/addons/deployment.yaml
	I0831 15:06:12.470929    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/nvidia-device-plugin.yaml
	I0831 15:06:12.477433    1563 addons.go:431] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml
	I0831 15:06:12.490117    1563 out.go:177]   - Using image gcr.io/k8s-minikube/minikube-ingress-dns:0.0.3
	I0831 15:06:12.490211    1563 addons.go:431] installing /etc/kubernetes/addons/registry-rc.yaml
	I0831 15:06:12.490607    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.490816    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:12.490894    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:12.511287    1563 addons.go:431] installing /etc/kubernetes/addons/ig-namespace.yaml
	I0831 15:06:12.511553    1563 ssh_runner.go:362] scp inspektor-gadget/ig-namespace.yaml --> /etc/kubernetes/addons/ig-namespace.yaml (55 bytes)
	I0831 15:06:12.511569    1563 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshotclasses.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml (6471 bytes)
	I0831 15:06:12.511582    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.511606    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-rc.yaml (860 bytes)
	I0831 15:06:12.553549    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.511516    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/deployment.yaml (1004 bytes)
	I0831 15:06:12.511646    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:12.574376    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.511661    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:12.511737    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.511768    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.511805    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.534010    1563 addons.go:431] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml
	I0831 15:06:12.615484    1563 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshotcontents.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml (23126 bytes)
	I0831 15:06:12.553205    1563 out.go:177]   - Using image docker.io/volcanosh/vc-controller-manager:v1.9.0
	I0831 15:06:12.553731    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.554630    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0831 15:06:12.574233    1563 addons.go:431] installing /etc/kubernetes/addons/ingress-deploy.yaml
	I0831 15:06:12.574630    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.574675    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.574709    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.574778    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.574846    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:12.615230    1563 addons.go:431] installing /etc/kubernetes/addons/ingress-dns-pod.yaml
	I0831 15:06:12.673710    1563 addons.go:431] installing /etc/kubernetes/addons/yakd-ns.yaml
	I0831 15:06:12.673772    1563 ssh_runner.go:362] scp yakd/yakd-ns.yaml --> /etc/kubernetes/addons/yakd-ns.yaml (171 bytes)
	I0831 15:06:12.673802    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.673906    1563 out.go:177]   - Using image registry.k8s.io/sig-storage/livenessprobe:v2.8.0
	I0831 15:06:12.674325    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:12.674347    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.674378    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:12.674438    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.673996    1563 addons.go:431] installing /etc/kubernetes/addons/helm-tiller-rbac.yaml
	I0831 15:06:12.674529    1563 ssh_runner.go:362] scp helm-tiller/helm-tiller-rbac.yaml --> /etc/kubernetes/addons/helm-tiller-rbac.yaml (1188 bytes)
	I0831 15:06:12.674015    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-dns-pod.yaml (2442 bytes)
	I0831 15:06:12.674585    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.674036    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-deploy.yaml (16078 bytes)
	I0831 15:06:12.674623    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.674640    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:12.674704    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:12.674643    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.674264    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.674292    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.674185    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.674762    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.674777    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:12.674932    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.674963    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.675000    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.676090    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:12.676115    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:12.711615    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:12.711646    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.676706    1563 addons.go:431] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml
	I0831 15:06:12.711657    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.711671    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.711682    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.711686    1563 ssh_runner.go:362] scp volumesnapshots/snapshot.storage.k8s.io_volumesnapshots.yaml --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml (19582 bytes)
	I0831 15:06:12.697504    1563 addons.go:431] installing /etc/kubernetes/addons/helm-tiller-svc.yaml
	I0831 15:06:12.711782    1563 ssh_runner.go:362] scp helm-tiller/helm-tiller-svc.yaml --> /etc/kubernetes/addons/helm-tiller-svc.yaml (951 bytes)
	I0831 15:06:12.711827    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.711863    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.711905    1563 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0831 15:06:12.769619    1563 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0831 15:06:12.769638    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.711906    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.732660    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.732659    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.769836    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.769902    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.783534    1563 addons.go:431] installing /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml
	I0831 15:06:12.806371    1563 ssh_runner.go:362] scp volumesnapshots/rbac-volume-snapshot-controller.yaml --> /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml (3545 bytes)
	I0831 15:06:12.799334    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/helm-tiller-dp.yaml -f /etc/kubernetes/addons/helm-tiller-rbac.yaml -f /etc/kubernetes/addons/helm-tiller-svc.yaml
	I0831 15:06:12.806084    1563 out.go:177]   - Using image docker.io/volcanosh/vc-scheduler:v1.9.0
	I0831 15:06:12.827170    1563 out.go:177]   - Using image docker.io/busybox:stable
	I0831 15:06:12.848458    1563 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-resizer:v1.6.0
	I0831 15:06:12.806491    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.827123    1563 addons.go:431] installing /etc/kubernetes/addons/metrics-server-deployment.yaml
	I0831 15:06:12.806425    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.859037    1563 start.go:971] {"host.minikube.internal": 192.169.0.1} host record injected into CoreDNS's ConfigMap
	I0831 15:06:12.859663    1563 node_ready.go:35] waiting up to 6m0s for node "addons-540000" to be "Ready" ...
	I0831 15:06:12.869253    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-deployment.yaml (1907 bytes)
	I0831 15:06:12.869566    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.869721    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.869957    1563 addons.go:431] installing /etc/kubernetes/addons/volcano-deployment.yaml
	I0831 15:06:12.869969    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/volcano-deployment.yaml (434001 bytes)
	I0831 15:06:12.870001    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.870182    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.870286    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.870398    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.870487    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:12.886159    1563 addons.go:431] installing /etc/kubernetes/addons/registry-svc.yaml
	I0831 15:06:12.886173    1563 ssh_runner.go:362] scp registry/registry-svc.yaml --> /etc/kubernetes/addons/registry-svc.yaml (398 bytes)
	I0831 15:06:12.933085    1563 addons.go:431] installing /etc/kubernetes/addons/ig-serviceaccount.yaml
	I0831 15:06:12.933098    1563 ssh_runner.go:362] scp inspektor-gadget/ig-serviceaccount.yaml --> /etc/kubernetes/addons/ig-serviceaccount.yaml (80 bytes)
	I0831 15:06:12.944748    1563 addons.go:431] installing /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I0831 15:06:12.944760    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml (1475 bytes)
	I0831 15:06:12.947884    1563 out.go:177]   - Using image docker.io/rancher/local-path-provisioner:v0.0.22
	I0831 15:06:12.947901    1563 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0
	I0831 15:06:12.974572    1563 addons.go:431] installing /etc/kubernetes/addons/registry-proxy.yaml
	I0831 15:06:12.974583    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-proxy.yaml (947 bytes)
	I0831 15:06:12.985284    1563 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner-rancher.yaml
	I0831 15:06:12.985298    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner-rancher.yaml (3113 bytes)
	I0831 15:06:12.985312    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:12.985482    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:12.985580    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:12.985657    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:12.985735    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:13.000523    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/deployment.yaml
	I0831 15:06:13.003290    1563 addons.go:431] installing /etc/kubernetes/addons/metrics-server-rbac.yaml
	I0831 15:06:13.003299    1563 ssh_runner.go:362] scp metrics-server/metrics-server-rbac.yaml --> /etc/kubernetes/addons/metrics-server-rbac.yaml (2175 bytes)
	I0831 15:06:13.030365    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml
	I0831 15:06:13.043144    1563 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-provisioner:v3.3.0
	I0831 15:06:13.078670    1563 addons.go:431] installing /etc/kubernetes/addons/ig-role.yaml
	I0831 15:06:13.078686    1563 ssh_runner.go:362] scp inspektor-gadget/ig-role.yaml --> /etc/kubernetes/addons/ig-role.yaml (210 bytes)
	I0831 15:06:13.082633    1563 node_ready.go:49] node "addons-540000" has status "Ready":"True"
	I0831 15:06:13.082646    1563 node_ready.go:38] duration metric: took 213.401668ms for node "addons-540000" to be "Ready" ...
	I0831 15:06:13.082654    1563 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:06:13.098215    1563 addons.go:431] installing /etc/kubernetes/addons/metrics-server-service.yaml
	I0831 15:06:13.098228    1563 ssh_runner.go:362] scp metrics-server/metrics-server-service.yaml --> /etc/kubernetes/addons/metrics-server-service.yaml (446 bytes)
	I0831 15:06:13.115702    1563 addons.go:431] installing /etc/kubernetes/addons/yakd-sa.yaml
	I0831 15:06:13.115715    1563 ssh_runner.go:362] scp yakd/yakd-sa.yaml --> /etc/kubernetes/addons/yakd-sa.yaml (247 bytes)
	I0831 15:06:13.117083    1563 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-attacher:v4.0.0
	I0831 15:06:13.131986    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml
	I0831 15:06:13.150948    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I0831 15:06:13.157255    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml
	I0831 15:06:13.175259    1563 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-external-health-monitor-controller:v0.7.0
	I0831 15:06:13.200561    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml
	I0831 15:06:13.219410    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/volcano-deployment.yaml
	I0831 15:06:13.233277    1563 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.6.0
	I0831 15:06:13.254736    1563 addons.go:431] installing /etc/kubernetes/addons/ig-rolebinding.yaml
	I0831 15:06:13.254749    1563 ssh_runner.go:362] scp inspektor-gadget/ig-rolebinding.yaml --> /etc/kubernetes/addons/ig-rolebinding.yaml (244 bytes)
	I0831 15:06:13.268168    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0831 15:06:13.270163    1563 addons.go:431] installing /etc/kubernetes/addons/rbac-external-attacher.yaml
	I0831 15:06:13.270176    1563 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-attacher.yaml --> /etc/kubernetes/addons/rbac-external-attacher.yaml (3073 bytes)
	I0831 15:06:13.270217    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:13.270385    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:13.270543    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:13.270659    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:13.270785    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:13.279781    1563 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:13.309469    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:13.309483    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:13.309656    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:13.309657    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:13.309668    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:13.309676    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:13.309682    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:13.309809    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:13.309811    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:13.309819    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:13.385993    1563 kapi.go:214] "coredns" deployment in "kube-system" namespace and "addons-540000" context rescaled to 1 replicas
	I0831 15:06:13.408341    1563 addons.go:431] installing /etc/kubernetes/addons/yakd-crb.yaml
	I0831 15:06:13.408353    1563 ssh_runner.go:362] scp yakd/yakd-crb.yaml --> /etc/kubernetes/addons/yakd-crb.yaml (422 bytes)
	I0831 15:06:13.504912    1563 addons.go:431] installing /etc/kubernetes/addons/ig-clusterrole.yaml
	I0831 15:06:13.504925    1563 ssh_runner.go:362] scp inspektor-gadget/ig-clusterrole.yaml --> /etc/kubernetes/addons/ig-clusterrole.yaml (1485 bytes)
	I0831 15:06:13.595037    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner-rancher.yaml
	I0831 15:06:13.632990    1563 addons.go:431] installing /etc/kubernetes/addons/yakd-svc.yaml
	I0831 15:06:13.633003    1563 ssh_runner.go:362] scp yakd/yakd-svc.yaml --> /etc/kubernetes/addons/yakd-svc.yaml (412 bytes)
	I0831 15:06:13.779022    1563 addons.go:431] installing /etc/kubernetes/addons/ig-clusterrolebinding.yaml
	I0831 15:06:13.779035    1563 ssh_runner.go:362] scp inspektor-gadget/ig-clusterrolebinding.yaml --> /etc/kubernetes/addons/ig-clusterrolebinding.yaml (274 bytes)
	I0831 15:06:13.933348    1563 addons.go:431] installing /etc/kubernetes/addons/yakd-dp.yaml
	I0831 15:06:13.933361    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/yakd-dp.yaml (2017 bytes)
	I0831 15:06:14.258899    1563 addons.go:431] installing /etc/kubernetes/addons/ig-crd.yaml
	I0831 15:06:14.258914    1563 ssh_runner.go:362] scp inspektor-gadget/ig-crd.yaml --> /etc/kubernetes/addons/ig-crd.yaml (5216 bytes)
	I0831 15:06:14.357062    1563 addons.go:431] installing /etc/kubernetes/addons/rbac-hostpath.yaml
	I0831 15:06:14.357080    1563 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-hostpath.yaml --> /etc/kubernetes/addons/rbac-hostpath.yaml (4266 bytes)
	I0831 15:06:14.688491    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/yakd-ns.yaml -f /etc/kubernetes/addons/yakd-sa.yaml -f /etc/kubernetes/addons/yakd-crb.yaml -f /etc/kubernetes/addons/yakd-svc.yaml -f /etc/kubernetes/addons/yakd-dp.yaml
	I0831 15:06:14.760693    1563 addons.go:431] installing /etc/kubernetes/addons/ig-daemonset.yaml
	I0831 15:06:14.760706    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-daemonset.yaml (7735 bytes)
	I0831 15:06:14.787791    1563 addons.go:431] installing /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml
	I0831 15:06:14.787802    1563 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-health-monitor-controller.yaml --> /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml (3038 bytes)
	I0831 15:06:14.918557    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/ig-namespace.yaml -f /etc/kubernetes/addons/ig-serviceaccount.yaml -f /etc/kubernetes/addons/ig-role.yaml -f /etc/kubernetes/addons/ig-rolebinding.yaml -f /etc/kubernetes/addons/ig-clusterrole.yaml -f /etc/kubernetes/addons/ig-clusterrolebinding.yaml -f /etc/kubernetes/addons/ig-crd.yaml -f /etc/kubernetes/addons/ig-daemonset.yaml
	I0831 15:06:14.931166    1563 addons.go:431] installing /etc/kubernetes/addons/rbac-external-provisioner.yaml
	I0831 15:06:14.931179    1563 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-provisioner.yaml --> /etc/kubernetes/addons/rbac-external-provisioner.yaml (4442 bytes)
	I0831 15:06:15.187639    1563 addons.go:431] installing /etc/kubernetes/addons/rbac-external-resizer.yaml
	I0831 15:06:15.187656    1563 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-resizer.yaml --> /etc/kubernetes/addons/rbac-external-resizer.yaml (2943 bytes)
	I0831 15:06:15.471689    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (2.797746152s)
	I0831 15:06:15.471717    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:15.471724    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:15.471885    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:15.471890    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:15.471899    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:15.471910    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:15.471917    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:15.472052    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:15.472078    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:15.472089    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:15.513635    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:15.783708    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/helm-tiller-dp.yaml -f /etc/kubernetes/addons/helm-tiller-rbac.yaml -f /etc/kubernetes/addons/helm-tiller-svc.yaml: (2.977254976s)
	I0831 15:06:15.783741    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:15.783753    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:15.783949    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:15.783961    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:15.783968    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:15.783975    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:15.784135    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:15.784149    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:15.784156    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:15.932463    1563 addons.go:431] installing /etc/kubernetes/addons/rbac-external-snapshotter.yaml
	I0831 15:06:15.932477    1563 ssh_runner.go:362] scp csi-hostpath-driver/rbac/rbac-external-snapshotter.yaml --> /etc/kubernetes/addons/rbac-external-snapshotter.yaml (3149 bytes)
	I0831 15:06:16.556995    1563 addons.go:431] installing /etc/kubernetes/addons/csi-hostpath-attacher.yaml
	I0831 15:06:16.557009    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-attacher.yaml (2143 bytes)
	I0831 15:06:16.896473    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/deployment.yaml: (3.895878699s)
	I0831 15:06:16.896499    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:16.896506    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:16.896632    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:16.896635    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:16.896652    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:16.896663    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:16.896671    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:16.896797    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:16.896805    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:17.258764    1563 addons.go:431] installing /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml
	I0831 15:06:17.258777    1563 ssh_runner.go:362] scp csi-hostpath-driver/deploy/csi-hostpath-driverinfo.yaml --> /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml (1274 bytes)
	I0831 15:06:17.497141    1563 addons.go:431] installing /etc/kubernetes/addons/csi-hostpath-plugin.yaml
	I0831 15:06:17.497153    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-plugin.yaml (8201 bytes)
	I0831 15:06:17.796328    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:17.816343    1563 addons.go:431] installing /etc/kubernetes/addons/csi-hostpath-resizer.yaml
	I0831 15:06:17.816359    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-resizer.yaml (2191 bytes)
	I0831 15:06:18.150219    1563 addons.go:431] installing /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	I0831 15:06:18.150232    1563 ssh_runner.go:362] scp csi-hostpath-driver/deploy/csi-hostpath-storageclass.yaml --> /etc/kubernetes/addons/csi-hostpath-storageclass.yaml (846 bytes)
	I0831 15:06:18.301049    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-hostpath.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	I0831 15:06:19.590013    1563 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_application_credentials.json (162 bytes)
	I0831 15:06:19.590037    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:19.590209    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:19.590313    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:19.590403    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:19.590513    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:19.829561    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:20.062478    1563 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_cloud_project (12 bytes)
	I0831 15:06:20.204144    1563 addons.go:234] Setting addon gcp-auth=true in "addons-540000"
	I0831 15:06:20.204175    1563 host.go:66] Checking if "addons-540000" exists ...
	I0831 15:06:20.204464    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:20.204481    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:20.213605    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49753
	I0831 15:06:20.213952    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:20.214296    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:20.214308    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:20.214524    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:20.214914    1563 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:06:20.214932    1563 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:06:20.224209    1563 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:49755
	I0831 15:06:20.224585    1563 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:06:20.224966    1563 main.go:141] libmachine: Using API Version  1
	I0831 15:06:20.224981    1563 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:06:20.225238    1563 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:06:20.225362    1563 main.go:141] libmachine: (addons-540000) Calling .GetState
	I0831 15:06:20.225458    1563 main.go:141] libmachine: (addons-540000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:06:20.225539    1563 main.go:141] libmachine: (addons-540000) DBG | hyperkit pid from json: 1576
	I0831 15:06:20.226579    1563 main.go:141] libmachine: (addons-540000) Calling .DriverName
	I0831 15:06:20.226781    1563 ssh_runner.go:195] Run: cat /var/lib/minikube/google_application_credentials.json
	I0831 15:06:20.226795    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHHostname
	I0831 15:06:20.226891    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHPort
	I0831 15:06:20.226986    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHKeyPath
	I0831 15:06:20.227105    1563 main.go:141] libmachine: (addons-540000) Calling .GetSSHUsername
	I0831 15:06:20.227211    1563 sshutil.go:53] new ssh client: &{IP:192.169.0.2 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/addons-540000/id_rsa Username:docker}
	I0831 15:06:21.198187    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml: (8.167693467s)
	I0831 15:06:21.198215    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml: (8.066095784s)
	I0831 15:06:21.198225    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:21.198237    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:21.198251    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:21.198279    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:21.198321    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (8.047244161s)
	W0831 15:06:21.198353    1563 addons.go:457] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I0831 15:06:21.198380    1563 retry.go:31] will retry after 299.281341ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I0831 15:06:21.198432    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml: (7.997749539s)
	I0831 15:06:21.198445    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:21.198413    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml: (8.041012879s)
	I0831 15:06:21.198457    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:21.198465    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:21.198475    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:21.198498    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:21.198477    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:21.198549    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:21.198565    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:21.198572    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:21.198633    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:21.198674    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:21.198686    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:21.198696    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:21.198646    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:21.198703    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:21.198750    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:21.198759    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:21.198767    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:21.198777    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:21.198782    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:21.198835    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:21.198845    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:21.198856    1563 addons.go:475] Verifying addon ingress=true in "addons-540000"
	I0831 15:06:21.198855    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:21.198986    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:21.198976    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:21.199015    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:21.199037    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:21.199044    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:21.199054    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:21.199066    1563 addons.go:475] Verifying addon registry=true in "addons-540000"
	I0831 15:06:21.199181    1563 main.go:141] libmachine: Failed to make call to close driver server: unexpected EOF
	I0831 15:06:21.199194    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:21.199202    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:21.199212    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:21.199428    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:21.200365    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:21.199466    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:21.200380    1563 addons.go:475] Verifying addon metrics-server=true in "addons-540000"
	I0831 15:06:21.225651    1563 out.go:177] * Verifying ingress addon...
	I0831 15:06:21.268052    1563 out.go:177] * Verifying registry addon...
	I0831 15:06:21.327672    1563 kapi.go:75] Waiting for pod with label "app.kubernetes.io/name=ingress-nginx" in ns "ingress-nginx" ...
	I0831 15:06:21.348615    1563 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=registry" in ns "kube-system" ...
	I0831 15:06:21.354943    1563 kapi.go:86] Found 2 Pods for label selector kubernetes.io/minikube-addons=registry
	I0831 15:06:21.354954    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:21.355828    1563 kapi.go:86] Found 3 Pods for label selector app.kubernetes.io/name=ingress-nginx
	I0831 15:06:21.355836    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:21.498468    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply --force -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I0831 15:06:21.914462    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:21.914602    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:22.012117    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:22.350124    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:22.401814    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:22.856898    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:22.904814    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:23.353268    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:23.353512    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:23.584534    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: (10.316207509s)
	I0831 15:06:23.584551    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/volcano-deployment.yaml: (10.364988178s)
	I0831 15:06:23.584569    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.584576    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.584588    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.584580    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.584607    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner-rancher.yaml: (9.989420328s)
	I0831 15:06:23.584627    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.584639    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.584671    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/yakd-ns.yaml -f /etc/kubernetes/addons/yakd-sa.yaml -f /etc/kubernetes/addons/yakd-crb.yaml -f /etc/kubernetes/addons/yakd-svc.yaml -f /etc/kubernetes/addons/yakd-dp.yaml: (8.896043651s)
	I0831 15:06:23.584689    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.584697    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.584790    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/ig-namespace.yaml -f /etc/kubernetes/addons/ig-serviceaccount.yaml -f /etc/kubernetes/addons/ig-role.yaml -f /etc/kubernetes/addons/ig-rolebinding.yaml -f /etc/kubernetes/addons/ig-clusterrole.yaml -f /etc/kubernetes/addons/ig-clusterrolebinding.yaml -f /etc/kubernetes/addons/ig-crd.yaml -f /etc/kubernetes/addons/ig-daemonset.yaml: (8.666089415s)
	I0831 15:06:23.584818    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.584823    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:23.584827    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.584827    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.584826    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.584836    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.584847    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.584855    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.584859    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.584872    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.584877    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.584884    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.584884    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.584885    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:23.584889    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:23.584862    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.584894    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.584910    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:23.584941    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.584949    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.584955    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.585009    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.585145    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:23.585169    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:23.585205    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:23.585225    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.585235    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.585239    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.585256    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.585268    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.585176    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.585287    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.585295    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.585391    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:23.585435    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.585442    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.585754    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.585759    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:23.585765    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.585767    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.585788    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.608043    1563 out.go:177] * To access YAKD - Kubernetes Dashboard, wait for Pod to be ready and run the following command:
	
		minikube -p addons-540000 service yakd-dashboard -n yakd-dashboard
	
	I0831 15:06:23.666472    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.666485    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.666695    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.666702    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.666714    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	W0831 15:06:23.666771    1563 out.go:270] ! Enabling 'default-storageclass' returned an error: running callbacks: [Error making standard the default storage class: Error while marking storage class local-path as non-default: Operation cannot be fulfilled on storageclasses.storage.k8s.io "local-path": the object has been modified; please apply your changes to the latest version and try again]
	I0831 15:06:23.714891    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:23.714903    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:23.715047    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:23.715054    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:23.853551    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:23.883490    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:24.302539    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-hostpath.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml: (6.001378067s)
	I0831 15:06:24.302547    1563 ssh_runner.go:235] Completed: cat /var/lib/minikube/google_application_credentials.json: (4.075698684s)
	I0831 15:06:24.302564    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:24.302572    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:24.302596    1563 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply --force -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (2.804058545s)
	I0831 15:06:24.302614    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:24.302693    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:24.302754    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:24.302763    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:24.302773    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:24.302781    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:24.302841    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:24.302851    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:24.302854    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:24.302867    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:24.302879    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:24.302943    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:24.302951    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:24.302963    1563 addons.go:475] Verifying addon csi-hostpath-driver=true in "addons-540000"
	I0831 15:06:24.303056    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:24.303245    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:24.303090    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:24.342023    1563 out.go:177]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.4.3
	I0831 15:06:24.383996    1563 out.go:177] * Verifying csi-hostpath-driver addon...
	I0831 15:06:24.426025    1563 out.go:177]   - Using image gcr.io/k8s-minikube/gcp-auth-webhook:v0.1.2
	I0831 15:06:24.426700    1563 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
	I0831 15:06:24.449851    1563 addons.go:431] installing /etc/kubernetes/addons/gcp-auth-ns.yaml
	I0831 15:06:24.449877    1563 ssh_runner.go:362] scp gcp-auth/gcp-auth-ns.yaml --> /etc/kubernetes/addons/gcp-auth-ns.yaml (700 bytes)
	I0831 15:06:24.456125    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:24.460643    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:24.460903    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:24.461113    1563 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
	I0831 15:06:24.461122    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:24.502218    1563 addons.go:431] installing /etc/kubernetes/addons/gcp-auth-service.yaml
	I0831 15:06:24.502231    1563 ssh_runner.go:362] scp gcp-auth/gcp-auth-service.yaml --> /etc/kubernetes/addons/gcp-auth-service.yaml (788 bytes)
	I0831 15:06:24.525871    1563 addons.go:431] installing /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I0831 15:06:24.525882    1563 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/gcp-auth-webhook.yaml (5421 bytes)
	I0831 15:06:24.562251    1563 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/gcp-auth-ns.yaml -f /etc/kubernetes/addons/gcp-auth-service.yaml -f /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I0831 15:06:24.832420    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:24.851365    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:24.930097    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:25.260448    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:25.260464    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:25.260677    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:25.260685    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:25.260690    1563 main.go:141] libmachine: Making call to close driver server
	I0831 15:06:25.260690    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:25.260694    1563 main.go:141] libmachine: (addons-540000) Calling .Close
	I0831 15:06:25.260839    1563 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:06:25.260850    1563 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:06:25.260861    1563 main.go:141] libmachine: (addons-540000) DBG | Closing plugin on server side
	I0831 15:06:25.261728    1563 addons.go:475] Verifying addon gcp-auth=true in "addons-540000"
	I0831 15:06:25.287886    1563 out.go:177] * Verifying gcp-auth addon...
	I0831 15:06:25.362062    1563 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=gcp-auth" in ns "gcp-auth" ...
	I0831 15:06:25.368791    1563 kapi.go:86] Found 0 Pods for label selector kubernetes.io/minikube-addons=gcp-auth
	I0831 15:06:25.369092    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:25.369482    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:25.469316    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:25.832965    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:25.852572    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:25.929952    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:26.330608    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:26.350625    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:26.432064    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:26.784391    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:26.831335    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:26.851076    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:26.931973    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:27.330904    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:27.351175    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:27.430424    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:27.831306    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:27.850837    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:27.929422    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:28.331293    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:28.352681    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:28.430714    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:28.833608    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:28.934304    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:28.935118    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:29.284046    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:29.332317    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:29.351668    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:29.431716    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:29.831566    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:29.851908    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:29.930134    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:30.330825    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:30.352589    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:30.429766    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:30.830638    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:30.852154    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:30.931131    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:31.284814    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:31.331525    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:31.436322    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:31.436884    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:31.831948    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:31.851573    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:31.932309    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:32.332478    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:32.351169    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:32.429988    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:32.832068    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:32.851289    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:32.930342    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:33.289419    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:33.332665    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:33.351090    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:33.430910    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:33.833439    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:33.851189    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:33.930653    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:34.333011    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:34.351060    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:34.430399    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:34.831584    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:34.851522    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:34.931115    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:35.330857    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:35.351874    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:35.430090    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:35.784777    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:35.831659    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:35.851764    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:35.929819    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:36.331653    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:36.351464    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:36.431747    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:36.830870    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:36.930741    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:36.931711    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:37.331350    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:37.352225    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:37.430355    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:37.830834    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:37.851659    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:37.929888    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:38.433392    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:38.433464    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:38.433782    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:38.434052    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:38.830697    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:38.851326    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:38.930802    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:39.330641    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:39.352846    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:39.429597    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:39.830821    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:39.853059    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:39.930615    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:40.409383    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:40.409826    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:40.429533    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:40.785774    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:40.830755    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:40.852996    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:40.934985    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:41.334603    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:41.351826    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:41.435211    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:41.830783    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:41.852849    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:41.929928    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:42.332613    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:42.350766    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:42.431505    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:42.836465    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:42.851116    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:42.931469    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:43.285627    1563 pod_ready.go:103] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"False"
	I0831 15:06:43.330988    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:43.351513    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:43.430180    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:43.830756    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:43.851007    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:43.930385    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:44.284498    1563 pod_ready.go:93] pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace has status "Ready":"True"
	I0831 15:06:44.284512    1563 pod_ready.go:82] duration metric: took 31.004308472s for pod "coredns-6f6b679f8f-4l4hf" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.284520    1563 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-mxdxz" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.286608    1563 pod_ready.go:98] error getting pod "coredns-6f6b679f8f-mxdxz" in "kube-system" namespace (skipping!): pods "coredns-6f6b679f8f-mxdxz" not found
	I0831 15:06:44.286620    1563 pod_ready.go:82] duration metric: took 2.094499ms for pod "coredns-6f6b679f8f-mxdxz" in "kube-system" namespace to be "Ready" ...
	E0831 15:06:44.286628    1563 pod_ready.go:67] WaitExtra: waitPodCondition: error getting pod "coredns-6f6b679f8f-mxdxz" in "kube-system" namespace (skipping!): pods "coredns-6f6b679f8f-mxdxz" not found
	I0831 15:06:44.286633    1563 pod_ready.go:79] waiting up to 6m0s for pod "etcd-addons-540000" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.290852    1563 pod_ready.go:93] pod "etcd-addons-540000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:06:44.290864    1563 pod_ready.go:82] duration metric: took 4.225711ms for pod "etcd-addons-540000" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.290871    1563 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-addons-540000" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.293903    1563 pod_ready.go:93] pod "kube-apiserver-addons-540000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:06:44.293911    1563 pod_ready.go:82] duration metric: took 3.035931ms for pod "kube-apiserver-addons-540000" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.293917    1563 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-addons-540000" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.298728    1563 pod_ready.go:93] pod "kube-controller-manager-addons-540000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:06:44.298738    1563 pod_ready.go:82] duration metric: took 4.816105ms for pod "kube-controller-manager-addons-540000" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.298743    1563 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-nwvnv" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.331794    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:44.351464    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:44.430331    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:44.482720    1563 pod_ready.go:93] pod "kube-proxy-nwvnv" in "kube-system" namespace has status "Ready":"True"
	I0831 15:06:44.482734    1563 pod_ready.go:82] duration metric: took 183.983301ms for pod "kube-proxy-nwvnv" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.482743    1563 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-addons-540000" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:44.832773    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:45.032556    1563 pod_ready.go:93] pod "kube-scheduler-addons-540000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:06:45.032570    1563 pod_ready.go:82] duration metric: took 549.814612ms for pod "kube-scheduler-addons-540000" in "kube-system" namespace to be "Ready" ...
	I0831 15:06:45.032576    1563 pod_ready.go:39] duration metric: took 31.949494125s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:06:45.032595    1563 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:06:45.033006    1563 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:06:45.035728    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:45.035886    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:45.068086    1563 api_server.go:72] duration metric: took 33.229007647s to wait for apiserver process to appear ...
	I0831 15:06:45.068099    1563 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:06:45.068116    1563 api_server.go:253] Checking apiserver healthz at https://192.169.0.2:8443/healthz ...
	I0831 15:06:45.074417    1563 api_server.go:279] https://192.169.0.2:8443/healthz returned 200:
	ok
	I0831 15:06:45.076946    1563 api_server.go:141] control plane version: v1.31.0
	I0831 15:06:45.076960    1563 api_server.go:131] duration metric: took 8.85618ms to wait for apiserver health ...
	I0831 15:06:45.076971    1563 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:06:45.091877    1563 system_pods.go:59] 18 kube-system pods found
	I0831 15:06:45.091900    1563 system_pods.go:61] "coredns-6f6b679f8f-4l4hf" [f27f4027-33aa-454c-8825-34f6fc764697] Running
	I0831 15:06:45.091906    1563 system_pods.go:61] "csi-hostpath-attacher-0" [3df309af-262e-4163-91f2-983330d0f9d9] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I0831 15:06:45.091910    1563 system_pods.go:61] "csi-hostpath-resizer-0" [dd5c0f40-d009-4632-a979-2eb314dd46ec] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I0831 15:06:45.091917    1563 system_pods.go:61] "csi-hostpathplugin-hrvqr" [97e9e8a4-9ab1-43c2-82d1-20be98f2af84] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I0831 15:06:45.091921    1563 system_pods.go:61] "etcd-addons-540000" [38fa0207-aa8a-44e7-a4ae-bdb1f0ff037b] Running
	I0831 15:06:45.091924    1563 system_pods.go:61] "kube-apiserver-addons-540000" [4c9612ce-9c7a-4bf2-99c6-aed6834f3022] Running
	I0831 15:06:45.091927    1563 system_pods.go:61] "kube-controller-manager-addons-540000" [b3756247-6132-45cf-9217-623d97b507ea] Running
	I0831 15:06:45.091930    1563 system_pods.go:61] "kube-ingress-dns-minikube" [d893f4a1-8b61-4986-94e6-8997e6dbe9be] Running
	I0831 15:06:45.091933    1563 system_pods.go:61] "kube-proxy-nwvnv" [b7b0c1ac-ed9e-43ec-ab65-af7502081cc7] Running
	I0831 15:06:45.091935    1563 system_pods.go:61] "kube-scheduler-addons-540000" [29d26c78-2e3a-43d9-bfd3-7784c3fdfdc9] Running
	I0831 15:06:45.091939    1563 system_pods.go:61] "metrics-server-84c5f94fbc-nt8q4" [770f613a-32ba-44c2-bf29-86d02a6c4f18] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I0831 15:06:45.091942    1563 system_pods.go:61] "nvidia-device-plugin-daemonset-q998b" [2d50fc12-bdb1-49e7-ae12-d5e775633fc0] Running
	I0831 15:06:45.091947    1563 system_pods.go:61] "registry-6fb4cdfc84-hbr57" [d2171259-b754-493c-a539-6115a91bf784] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I0831 15:06:45.091952    1563 system_pods.go:61] "registry-proxy-j5x8q" [97be67b8-2585-454b-bdbb-6a388e9592e6] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I0831 15:06:45.091956    1563 system_pods.go:61] "snapshot-controller-56fcc65765-fz8sj" [ac51873e-ab20-430f-80cd-78f4940a3beb] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0831 15:06:45.091961    1563 system_pods.go:61] "snapshot-controller-56fcc65765-tbfxh" [0eedbe18-6f6b-4275-8301-2ce7d4c36f70] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0831 15:06:45.091964    1563 system_pods.go:61] "storage-provisioner" [90b3e2d9-94f0-44d7-8357-7ea9daee4b89] Running
	I0831 15:06:45.091968    1563 system_pods.go:61] "tiller-deploy-b48cc5f79-fd6zx" [acc8c5ea-8b97-4176-a6fc-526116814954] Running
	I0831 15:06:45.091972    1563 system_pods.go:74] duration metric: took 14.996327ms to wait for pod list to return data ...
	I0831 15:06:45.091978    1563 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:06:45.284039    1563 default_sa.go:45] found service account: "default"
	I0831 15:06:45.284055    1563 default_sa.go:55] duration metric: took 192.068652ms for default service account to be created ...
	I0831 15:06:45.284062    1563 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:06:45.333028    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:45.353298    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:45.432188    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:45.488070    1563 system_pods.go:86] 18 kube-system pods found
	I0831 15:06:45.488086    1563 system_pods.go:89] "coredns-6f6b679f8f-4l4hf" [f27f4027-33aa-454c-8825-34f6fc764697] Running
	I0831 15:06:45.488094    1563 system_pods.go:89] "csi-hostpath-attacher-0" [3df309af-262e-4163-91f2-983330d0f9d9] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I0831 15:06:45.488106    1563 system_pods.go:89] "csi-hostpath-resizer-0" [dd5c0f40-d009-4632-a979-2eb314dd46ec] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I0831 15:06:45.488114    1563 system_pods.go:89] "csi-hostpathplugin-hrvqr" [97e9e8a4-9ab1-43c2-82d1-20be98f2af84] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I0831 15:06:45.488121    1563 system_pods.go:89] "etcd-addons-540000" [38fa0207-aa8a-44e7-a4ae-bdb1f0ff037b] Running
	I0831 15:06:45.488130    1563 system_pods.go:89] "kube-apiserver-addons-540000" [4c9612ce-9c7a-4bf2-99c6-aed6834f3022] Running
	I0831 15:06:45.488134    1563 system_pods.go:89] "kube-controller-manager-addons-540000" [b3756247-6132-45cf-9217-623d97b507ea] Running
	I0831 15:06:45.488138    1563 system_pods.go:89] "kube-ingress-dns-minikube" [d893f4a1-8b61-4986-94e6-8997e6dbe9be] Running
	I0831 15:06:45.488141    1563 system_pods.go:89] "kube-proxy-nwvnv" [b7b0c1ac-ed9e-43ec-ab65-af7502081cc7] Running
	I0831 15:06:45.488144    1563 system_pods.go:89] "kube-scheduler-addons-540000" [29d26c78-2e3a-43d9-bfd3-7784c3fdfdc9] Running
	I0831 15:06:45.488148    1563 system_pods.go:89] "metrics-server-84c5f94fbc-nt8q4" [770f613a-32ba-44c2-bf29-86d02a6c4f18] Pending / Ready:ContainersNotReady (containers with unready status: [metrics-server]) / ContainersReady:ContainersNotReady (containers with unready status: [metrics-server])
	I0831 15:06:45.488151    1563 system_pods.go:89] "nvidia-device-plugin-daemonset-q998b" [2d50fc12-bdb1-49e7-ae12-d5e775633fc0] Running
	I0831 15:06:45.488155    1563 system_pods.go:89] "registry-6fb4cdfc84-hbr57" [d2171259-b754-493c-a539-6115a91bf784] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I0831 15:06:45.488160    1563 system_pods.go:89] "registry-proxy-j5x8q" [97be67b8-2585-454b-bdbb-6a388e9592e6] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I0831 15:06:45.488166    1563 system_pods.go:89] "snapshot-controller-56fcc65765-fz8sj" [ac51873e-ab20-430f-80cd-78f4940a3beb] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0831 15:06:45.488172    1563 system_pods.go:89] "snapshot-controller-56fcc65765-tbfxh" [0eedbe18-6f6b-4275-8301-2ce7d4c36f70] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0831 15:06:45.488175    1563 system_pods.go:89] "storage-provisioner" [90b3e2d9-94f0-44d7-8357-7ea9daee4b89] Running
	I0831 15:06:45.488179    1563 system_pods.go:89] "tiller-deploy-b48cc5f79-fd6zx" [acc8c5ea-8b97-4176-a6fc-526116814954] Running
	I0831 15:06:45.488184    1563 system_pods.go:126] duration metric: took 204.115395ms to wait for k8s-apps to be running ...
	I0831 15:06:45.488190    1563 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:06:45.488237    1563 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:06:45.500556    1563 system_svc.go:56] duration metric: took 12.361567ms WaitForService to wait for kubelet
	I0831 15:06:45.500571    1563 kubeadm.go:582] duration metric: took 33.661488876s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:06:45.500582    1563 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:06:45.684199    1563 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:06:45.684224    1563 node_conditions.go:123] node cpu capacity is 2
	I0831 15:06:45.684236    1563 node_conditions.go:105] duration metric: took 183.647681ms to run NodePressure ...
	I0831 15:06:45.684246    1563 start.go:241] waiting for startup goroutines ...
	I0831 15:06:45.832719    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:45.853701    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:45.931404    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:46.332495    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:46.351025    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:46.432638    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:46.831704    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:46.851752    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:46.932422    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:47.330988    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:47.351703    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:47.430037    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:47.831013    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:47.851395    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:47.931260    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:48.331891    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:48.352065    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:48.430279    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:48.831134    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:48.851291    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:48.931025    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:49.331346    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:49.351451    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:49.429875    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:49.830713    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:49.851401    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:49.930816    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:50.331069    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:50.351220    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:50.429889    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:50.830989    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:50.931875    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:50.932193    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:51.331026    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:51.351739    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:51.430141    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:51.831581    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:51.853656    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:51.956214    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:52.331413    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:52.353299    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:52.430533    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:52.832495    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:52.853181    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0831 15:06:52.932845    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:53.331299    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:53.352699    1563 kapi.go:107] duration metric: took 32.003661089s to wait for kubernetes.io/minikube-addons=registry ...
	I0831 15:06:53.430385    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:53.831894    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:53.932671    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:54.331403    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:54.430294    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:54.831643    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:54.933102    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:55.331208    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:55.430496    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:55.831489    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:55.932850    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:56.331985    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:56.430935    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:56.831874    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:56.931086    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:57.331517    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:57.430242    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:57.831789    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:57.966884    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:58.334113    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:58.432012    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:58.832760    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:58.930648    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:59.331492    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:59.430659    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:06:59.831704    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:06:59.931209    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:00.332238    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:00.432055    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:00.833517    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:00.931719    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:01.333341    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:01.430406    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:01.831946    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:01.930741    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:02.330950    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:02.431305    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:02.831536    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:02.930404    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:03.331266    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:03.431385    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:03.831818    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:03.930286    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:04.330748    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:04.429893    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:04.890808    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:04.991755    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:05.331367    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:05.431309    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:05.831508    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:05.930391    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:06.330623    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:06.430618    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:06.831341    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:06.929774    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:07.331200    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:07.431730    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:07.832808    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:07.935369    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:08.331598    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:08.431338    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:08.832152    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:08.930543    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:09.331257    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:09.430883    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:09.832246    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:09.931048    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:10.331864    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:10.432265    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:10.832755    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:10.931083    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:11.331508    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:11.430118    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:11.831327    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:11.930344    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:12.331702    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:12.430863    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:12.832983    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:13.006298    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:13.331178    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:13.431021    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:13.831908    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:13.933301    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:14.331799    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:14.430555    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:14.832739    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:14.930468    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:15.331193    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:15.430695    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:15.831679    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:15.931131    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:16.331335    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:16.432228    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:16.831823    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:16.933748    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:17.331864    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:17.431406    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:17.831823    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:17.931345    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:18.331195    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:18.431284    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:18.831635    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:18.930375    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:19.332396    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:19.433835    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:19.833253    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:19.930860    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:20.332085    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:20.431901    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:20.831834    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:20.931313    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:21.331493    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:21.433714    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:21.833080    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:21.933099    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:22.331628    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:22.430714    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:22.832604    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:22.930425    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:23.331487    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:23.430584    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:23.831226    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:23.931558    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:24.331974    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:24.430493    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:24.832163    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:24.930776    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:25.332300    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:25.433365    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:25.832745    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:25.930945    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:26.331346    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:26.431539    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:26.832949    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:26.930762    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:27.333090    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:27.430727    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:27.831772    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:27.932543    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:28.332950    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:28.430961    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:28.831819    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:28.931597    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:29.331774    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:29.431526    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:29.832251    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:29.933208    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:30.331283    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:30.431997    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:30.831509    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:30.930948    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:31.332366    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:31.430565    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:31.832019    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:31.931060    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:32.331665    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:32.430891    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:32.832742    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:32.932963    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:33.332557    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:33.431606    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:33.831703    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:33.931584    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:34.331360    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:34.431706    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:34.831675    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:34.931898    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:35.331819    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:35.466444    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:35.831623    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:35.931643    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:36.334941    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:36.431644    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:36.832375    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:36.932641    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:37.332723    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:37.430545    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:37.831673    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:37.930931    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:38.332944    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:38.431494    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:38.831987    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:38.931399    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:39.331565    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:39.430816    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:39.831432    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:39.930707    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:40.331679    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:40.431151    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:40.910705    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:41.013302    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:41.331886    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:41.432225    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:41.831693    1563 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0831 15:07:41.930629    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:42.331869    1563 kapi.go:107] duration metric: took 1m21.00312952s to wait for app.kubernetes.io/name=ingress-nginx ...
	I0831 15:07:42.430925    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:42.970415    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:43.430681    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:43.930815    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:44.431580    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:44.930892    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:45.468354    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:45.930747    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:46.431237    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:46.931086    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:47.430653    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:47.967768    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:48.431498    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:48.932978    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:49.430756    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:49.930766    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:50.430482    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:50.970792    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0831 15:07:51.431577    1563 kapi.go:107] duration metric: took 1m27.003728993s to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
	I0831 15:09:10.368144    1563 kapi.go:86] Found 1 Pods for label selector kubernetes.io/minikube-addons=gcp-auth
	I0831 15:09:10.368157    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0831 15:09:10.868210    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0831 15:09:11.367456    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0831 15:09:11.866499    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0831 15:09:12.366725    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0831 15:09:12.867948    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0831 15:09:13.368578    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0831 15:09:13.866883    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0831 15:09:14.368642    1563 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0831 15:09:14.866355    1563 kapi.go:107] duration metric: took 2m49.502059138s to wait for kubernetes.io/minikube-addons=gcp-auth ...
	I0831 15:09:14.892542    1563 out.go:177] * Your GCP credentials will now be mounted into every pod created in the addons-540000 cluster.
	I0831 15:09:14.913370    1563 out.go:177] * If you don't want your credentials mounted into a specific pod, add a label with the `gcp-auth-skip-secret` key to your pod configuration.
	I0831 15:09:14.935319    1563 out.go:177] * If you want existing pods to be mounted with credentials, either recreate them or rerun addons enable with --refresh.
	I0831 15:09:14.994206    1563 out.go:177] * Enabled addons: nvidia-device-plugin, storage-provisioner, helm-tiller, cloud-spanner, ingress-dns, metrics-server, volcano, inspektor-gadget, yakd, storage-provisioner-rancher, volumesnapshots, registry, ingress, csi-hostpath-driver, gcp-auth
	I0831 15:09:15.068132    1563 addons.go:510] duration metric: took 3m3.227071663s for enable addons: enabled=[nvidia-device-plugin storage-provisioner helm-tiller cloud-spanner ingress-dns metrics-server volcano inspektor-gadget yakd storage-provisioner-rancher volumesnapshots registry ingress csi-hostpath-driver gcp-auth]
	I0831 15:09:15.068182    1563 start.go:246] waiting for cluster config update ...
	I0831 15:09:15.068211    1563 start.go:255] writing updated cluster config ...
	I0831 15:09:15.070479    1563 ssh_runner.go:195] Run: rm -f paused
	I0831 15:09:15.121988    1563 start.go:600] kubectl: 1.29.2, cluster: 1.31.0 (minor skew: 2)
	I0831 15:09:15.143676    1563 out.go:201] 
	W0831 15:09:15.165471    1563 out.go:270] ! /usr/local/bin/kubectl is version 1.29.2, which may have incompatibilities with Kubernetes 1.31.0.
	I0831 15:09:15.186135    1563 out.go:177]   - Want kubectl v1.31.0? Try 'minikube kubectl -- get pods -A'
	I0831 15:09:15.299138    1563 out.go:177] * Done! kubectl is now configured to use "addons-540000" cluster and "default" namespace by default
	
	
	==> Docker <==
	Aug 31 22:19:09 addons-540000 dockerd[1227]: time="2024-08-31T22:19:09.825217379Z" level=warning msg="cleaning up after shim disconnected" id=9f69eeabcebec61df0a5ebe098112da73ce136e4f35fcb392f6d1a98b0c9175a namespace=moby
	Aug 31 22:19:09 addons-540000 dockerd[1227]: time="2024-08-31T22:19:09.825396898Z" level=info msg="cleaning up dead shim" namespace=moby
	Aug 31 22:19:09 addons-540000 dockerd[1221]: time="2024-08-31T22:19:09.826553252Z" level=info msg="ignoring event" container=9f69eeabcebec61df0a5ebe098112da73ce136e4f35fcb392f6d1a98b0c9175a module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.308075158Z" level=info msg="shim disconnected" id=110c503d0b78f4f3b440fad4f1fb007492232a887c7eccd0b54a241715fee42a namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.308164267Z" level=warning msg="cleaning up after shim disconnected" id=110c503d0b78f4f3b440fad4f1fb007492232a887c7eccd0b54a241715fee42a namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.308222511Z" level=info msg="cleaning up dead shim" namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1221]: time="2024-08-31T22:19:10.308512390Z" level=info msg="ignoring event" container=110c503d0b78f4f3b440fad4f1fb007492232a887c7eccd0b54a241715fee42a module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.313205144Z" level=info msg="shim disconnected" id=1de2ebd6c1edda4f600bc8b1627576fe70782f5ce6a740c710dbe9c76ef73b17 namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.313277579Z" level=warning msg="cleaning up after shim disconnected" id=1de2ebd6c1edda4f600bc8b1627576fe70782f5ce6a740c710dbe9c76ef73b17 namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.313286981Z" level=info msg="cleaning up dead shim" namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1221]: time="2024-08-31T22:19:10.314242464Z" level=info msg="ignoring event" container=1de2ebd6c1edda4f600bc8b1627576fe70782f5ce6a740c710dbe9c76ef73b17 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.529602744Z" level=info msg="shim disconnected" id=1f9732b3d9cbfe7cfb297fedd5131e58839b9564b86cb317c52f36b75748dfd4 namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.529669208Z" level=warning msg="cleaning up after shim disconnected" id=1f9732b3d9cbfe7cfb297fedd5131e58839b9564b86cb317c52f36b75748dfd4 namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.529680344Z" level=info msg="cleaning up dead shim" namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1221]: time="2024-08-31T22:19:10.529897422Z" level=info msg="ignoring event" container=1f9732b3d9cbfe7cfb297fedd5131e58839b9564b86cb317c52f36b75748dfd4 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.605029727Z" level=info msg="shim disconnected" id=b86f264260e8d1f9cb1677ee490ea4a36af120d80a17f2626088e14be5b86924 namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.605524928Z" level=warning msg="cleaning up after shim disconnected" id=b86f264260e8d1f9cb1677ee490ea4a36af120d80a17f2626088e14be5b86924 namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.605605499Z" level=info msg="cleaning up dead shim" namespace=moby
	Aug 31 22:19:10 addons-540000 dockerd[1221]: time="2024-08-31T22:19:10.609856660Z" level=info msg="ignoring event" container=b86f264260e8d1f9cb1677ee490ea4a36af120d80a17f2626088e14be5b86924 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Aug 31 22:19:10 addons-540000 dockerd[1227]: time="2024-08-31T22:19:10.675006960Z" level=warning msg="cleanup warnings time=\"2024-08-31T22:19:10Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=moby
	Aug 31 22:19:11 addons-540000 cri-dockerd[1120]: time="2024-08-31T22:19:11Z" level=info msg="Stop pulling image docker.io/nginx:alpine: Status: Downloaded newer image for nginx:alpine"
	Aug 31 22:19:11 addons-540000 dockerd[1227]: time="2024-08-31T22:19:11.239833827Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:19:11 addons-540000 dockerd[1227]: time="2024-08-31T22:19:11.239923449Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:19:11 addons-540000 dockerd[1227]: time="2024-08-31T22:19:11.239933037Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:19:11 addons-540000 dockerd[1227]: time="2024-08-31T22:19:11.240053903Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                                        CREATED                  STATE               NAME                       ATTEMPT             POD ID              POD
	2a1d4cbdb635c       nginx@sha256:c04c18adc2a407740a397c8407c011fc6c90026a9b65cceddef7ae5484360158                                                Less than a second ago   Running             nginx                      0                   3801a844df0ab       nginx
	c338f6906e4ef       gcr.io/k8s-minikube/gcp-auth-webhook@sha256:e6c5b3bc32072ea370d34c27836efd11b3519d25bd444c2a8efc339cff0e20fb                 9 minutes ago            Running             gcp-auth                   0                   371b24e7b5be8       gcp-auth-89d5ffd79-9xlxx
	6a286267e7419       registry.k8s.io/ingress-nginx/controller@sha256:d5f8217feeac4887cb1ed21f27c2674e58be06bd8f5184cacea2a69abaf78dce             11 minutes ago           Running             controller                 0                   e32766aa12c4f       ingress-nginx-controller-bc57996ff-78jrp
	e4c80e2afd995       ce263a8653f9c                                                                                                                12 minutes ago           Exited              patch                      1                   6bad39d2920dd       ingress-nginx-admission-patch-fctdf
	672e618c40808       registry.k8s.io/ingress-nginx/kube-webhook-certgen@sha256:a320a50cc91bd15fd2d6fa6de58bd98c1bd64b9a6f926ce23a600d87043455a3   12 minutes ago           Exited              create                     0                   dbc63139fb1e5       ingress-nginx-admission-create-c9ms7
	4a5a90fc617d8       marcnuri/yakd@sha256:c5414196116a2266ad097b0468833b73ef1d6c7922241115fe203fb826381624                                        12 minutes ago           Running             yakd                       0                   37adfff572a6e       yakd-dashboard-67d98fc6b-f7zvd
	6c97ae2c69818       rancher/local-path-provisioner@sha256:e34c88ae0affb1cdefbb874140d6339d4a27ec4ee420ae8199cd839997b05246                       12 minutes ago           Running             local-path-provisioner     0                   2f63e55eedefd       local-path-provisioner-86d989889c-kwlpr
	ac0d2cf55825e       gcr.io/cloud-spanner-emulator/emulator@sha256:636fdfc528824bae5f0ea2eca6ae307fe81092f05ec21038008bc0d6100e52fc               12 minutes ago           Running             cloud-spanner-emulator     0                   38c7ad0e6067a       cloud-spanner-emulator-769b77f747-xw9pt
	4422a1e29d383       gcr.io/k8s-minikube/minikube-ingress-dns@sha256:4211a1de532376c881851542238121b26792225faa36a7b02dccad88fd05797c             12 minutes ago           Running             minikube-ingress-dns       0                   0944f5a83b6bc       kube-ingress-dns-minikube
	874c233ecf545       nvcr.io/nvidia/k8s-device-plugin@sha256:ed39e22c8b71343fb996737741a99da88ce6c75dd83b5c520e0b3d8e8a884c47                     12 minutes ago           Running             nvidia-device-plugin-ctr   0                   6c662ad6f2c81       nvidia-device-plugin-daemonset-q998b
	3096c60c5afa3       6e38f40d628db                                                                                                                12 minutes ago           Running             storage-provisioner        0                   1e5da85e7a77f       storage-provisioner
	34f8a5e5b3882       cbb01a7bd410d                                                                                                                12 minutes ago           Running             coredns                    0                   b0d6fd93ac674       coredns-6f6b679f8f-4l4hf
	4e9b4c61d08f9       ad83b2ca7b09e                                                                                                                12 minutes ago           Running             kube-proxy                 0                   31a131bdb30cd       kube-proxy-nwvnv
	cda9242c7e842       2e96e5913fc06                                                                                                                13 minutes ago           Running             etcd                       0                   3e239c46eb74e       etcd-addons-540000
	90f4af9d6e897       1766f54c897f0                                                                                                                13 minutes ago           Running             kube-scheduler             0                   644697d25b3d0       kube-scheduler-addons-540000
	5bf334a8d125f       045733566833c                                                                                                                13 minutes ago           Running             kube-controller-manager    0                   072fe66e3642e       kube-controller-manager-addons-540000
	646be17f77ef1       604f5db92eaa8                                                                                                                13 minutes ago           Running             kube-apiserver             0                   c7e5d0ee28d59       kube-apiserver-addons-540000
	
	
	==> controller_ingress [6a286267e741] <==
	I0831 22:07:41.492794       7 event.go:377] Event(v1.ObjectReference{Kind:"ConfigMap", Namespace:"ingress-nginx", Name:"tcp-services", UID:"ee599e20-2f64-4579-9246-eeb28e853e79", APIVersion:"v1", ResourceVersion:"709", FieldPath:""}): type: 'Normal' reason: 'CREATE' ConfigMap ingress-nginx/tcp-services
	I0831 22:07:41.492831       7 event.go:377] Event(v1.ObjectReference{Kind:"ConfigMap", Namespace:"ingress-nginx", Name:"udp-services", UID:"f282629a-7cc3-4e10-84c6-f4a2334098ef", APIVersion:"v1", ResourceVersion:"710", FieldPath:""}): type: 'Normal' reason: 'CREATE' ConfigMap ingress-nginx/udp-services
	I0831 22:07:42.674497       7 nginx.go:317] "Starting NGINX process"
	I0831 22:07:42.674733       7 leaderelection.go:250] attempting to acquire leader lease ingress-nginx/ingress-nginx-leader...
	I0831 22:07:42.677864       7 nginx.go:337] "Starting validation webhook" address=":8443" certPath="/usr/local/certificates/cert" keyPath="/usr/local/certificates/key"
	I0831 22:07:42.678436       7 controller.go:193] "Configuration changes detected, backend reload required"
	I0831 22:07:42.683013       7 leaderelection.go:260] successfully acquired lease ingress-nginx/ingress-nginx-leader
	I0831 22:07:42.683188       7 status.go:85] "New leader elected" identity="ingress-nginx-controller-bc57996ff-78jrp"
	I0831 22:07:42.695929       7 status.go:219] "POD is not ready" pod="ingress-nginx/ingress-nginx-controller-bc57996ff-78jrp" node="addons-540000"
	I0831 22:07:42.730586       7 controller.go:213] "Backend successfully reloaded"
	I0831 22:07:42.730714       7 controller.go:224] "Initial sync, sleeping for 1 second"
	I0831 22:07:42.730903       7 event.go:377] Event(v1.ObjectReference{Kind:"Pod", Namespace:"ingress-nginx", Name:"ingress-nginx-controller-bc57996ff-78jrp", UID:"1e3d18e7-f1d3-4786-aafd-08b326da9b63", APIVersion:"v1", ResourceVersion:"738", FieldPath:""}): type: 'Normal' reason: 'RELOAD' NGINX reload triggered due to a change in configuration
	W0831 22:19:07.230410       7 controller.go:1110] Error obtaining Endpoints for Service "default/nginx": no object matching key "default/nginx" in local store
	I0831 22:19:07.245087       7 admission.go:149] processed ingress via admission controller {testedIngressLength:1 testedIngressTime:0.015s renderingIngressLength:1 renderingIngressTime:0s admissionTime:0.015s testedConfigurationSize:18.1kB}
	I0831 22:19:07.245124       7 main.go:107] "successfully validated configuration, accepting" ingress="default/nginx-ingress"
	I0831 22:19:07.250369       7 store.go:440] "Found valid IngressClass" ingress="default/nginx-ingress" ingressclass="nginx"
	I0831 22:19:07.250951       7 event.go:377] Event(v1.ObjectReference{Kind:"Ingress", Namespace:"default", Name:"nginx-ingress", UID:"72febf18-1227-425f-a4b7-508f06f1e3a0", APIVersion:"networking.k8s.io/v1", ResourceVersion:"2847", FieldPath:""}): type: 'Normal' reason: 'Sync' Scheduled for sync
	W0831 22:19:07.251244       7 controller.go:1110] Error obtaining Endpoints for Service "default/nginx": no object matching key "default/nginx" in local store
	I0831 22:19:07.251382       7 controller.go:193] "Configuration changes detected, backend reload required"
	I0831 22:19:07.283220       7 controller.go:213] "Backend successfully reloaded"
	I0831 22:19:07.283545       7 event.go:377] Event(v1.ObjectReference{Kind:"Pod", Namespace:"ingress-nginx", Name:"ingress-nginx-controller-bc57996ff-78jrp", UID:"1e3d18e7-f1d3-4786-aafd-08b326da9b63", APIVersion:"v1", ResourceVersion:"738", FieldPath:""}): type: 'Normal' reason: 'RELOAD' NGINX reload triggered due to a change in configuration
	W0831 22:19:10.584845       7 controller.go:1216] Service "default/nginx" does not have any active Endpoint.
	I0831 22:19:10.584972       7 controller.go:193] "Configuration changes detected, backend reload required"
	I0831 22:19:10.631112       7 controller.go:213] "Backend successfully reloaded"
	I0831 22:19:10.631351       7 event.go:377] Event(v1.ObjectReference{Kind:"Pod", Namespace:"ingress-nginx", Name:"ingress-nginx-controller-bc57996ff-78jrp", UID:"1e3d18e7-f1d3-4786-aafd-08b326da9b63", APIVersion:"v1", ResourceVersion:"738", FieldPath:""}): type: 'Normal' reason: 'RELOAD' NGINX reload triggered due to a change in configuration
	
	
	==> coredns [34f8a5e5b388] <==
	[INFO] 10.244.0.7:53643 - 15652 "A IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 110 0.000160957s
	[INFO] 10.244.0.7:42076 - 15057 "A IN registry.kube-system.svc.cluster.local.kube-system.svc.cluster.local. udp 86 false 512" NXDOMAIN qr,aa,rd 179 0.000122019s
	[INFO] 10.244.0.7:42076 - 23518 "AAAA IN registry.kube-system.svc.cluster.local.kube-system.svc.cluster.local. udp 86 false 512" NXDOMAIN qr,aa,rd 179 0.000095877s
	[INFO] 10.244.0.7:34937 - 27982 "A IN registry.kube-system.svc.cluster.local.svc.cluster.local. udp 74 false 512" NXDOMAIN qr,aa,rd 167 0.000100098s
	[INFO] 10.244.0.7:34937 - 7244 "AAAA IN registry.kube-system.svc.cluster.local.svc.cluster.local. udp 74 false 512" NXDOMAIN qr,aa,rd 167 0.000057059s
	[INFO] 10.244.0.7:60086 - 24384 "A IN registry.kube-system.svc.cluster.local.cluster.local. udp 70 false 512" NXDOMAIN qr,aa,rd 163 0.000085061s
	[INFO] 10.244.0.7:60086 - 55622 "AAAA IN registry.kube-system.svc.cluster.local.cluster.local. udp 70 false 512" NXDOMAIN qr,aa,rd 163 0.000057809s
	[INFO] 10.244.0.7:33274 - 32356 "A IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 110 0.000118408s
	[INFO] 10.244.0.7:33274 - 8293 "AAAA IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 149 0.000089265s
	[INFO] 10.244.0.7:53249 - 7719 "A IN registry.kube-system.svc.cluster.local.kube-system.svc.cluster.local. udp 86 false 512" NXDOMAIN qr,aa,rd 179 0.000104379s
	[INFO] 10.244.0.7:53249 - 14113 "AAAA IN registry.kube-system.svc.cluster.local.kube-system.svc.cluster.local. udp 86 false 512" NXDOMAIN qr,aa,rd 179 0.000904135s
	[INFO] 10.244.0.7:51798 - 43914 "A IN registry.kube-system.svc.cluster.local.svc.cluster.local. udp 74 false 512" NXDOMAIN qr,aa,rd 167 0.000041828s
	[INFO] 10.244.0.7:51798 - 42120 "AAAA IN registry.kube-system.svc.cluster.local.svc.cluster.local. udp 74 false 512" NXDOMAIN qr,aa,rd 167 0.000063397s
	[INFO] 10.244.0.7:44020 - 5513 "AAAA IN registry.kube-system.svc.cluster.local.cluster.local. udp 70 false 512" NXDOMAIN qr,aa,rd 163 0.000085117s
	[INFO] 10.244.0.7:44020 - 19343 "A IN registry.kube-system.svc.cluster.local.cluster.local. udp 70 false 512" NXDOMAIN qr,aa,rd 163 0.000035593s
	[INFO] 10.244.0.7:35479 - 29435 "A IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 110 0.000049415s
	[INFO] 10.244.0.7:35479 - 40441 "AAAA IN registry.kube-system.svc.cluster.local. udp 56 false 512" NOERROR qr,aa,rd 149 0.000071194s
	[INFO] 10.244.0.26:56876 - 29525 "AAAA IN storage.googleapis.com.gcp-auth.svc.cluster.local. udp 78 false 1232" NXDOMAIN qr,aa,rd 160 0.000165558s
	[INFO] 10.244.0.26:37056 - 22689 "A IN storage.googleapis.com.gcp-auth.svc.cluster.local. udp 78 false 1232" NXDOMAIN qr,aa,rd 160 0.00007736s
	[INFO] 10.244.0.26:40891 - 1571 "AAAA IN storage.googleapis.com.svc.cluster.local. udp 69 false 1232" NXDOMAIN qr,aa,rd 151 0.000098188s
	[INFO] 10.244.0.26:50736 - 46654 "A IN storage.googleapis.com.svc.cluster.local. udp 69 false 1232" NXDOMAIN qr,aa,rd 151 0.000075664s
	[INFO] 10.244.0.26:57084 - 51606 "AAAA IN storage.googleapis.com.cluster.local. udp 65 false 1232" NXDOMAIN qr,aa,rd 147 0.000046075s
	[INFO] 10.244.0.26:55650 - 30990 "A IN storage.googleapis.com.cluster.local. udp 65 false 1232" NXDOMAIN qr,aa,rd 147 0.000048752s
	[INFO] 10.244.0.26:45749 - 65436 "AAAA IN storage.googleapis.com. udp 51 false 1232" NOERROR qr,rd,ra 240 0.000899815s
	[INFO] 10.244.0.26:60480 - 17098 "A IN storage.googleapis.com. udp 51 false 1232" NOERROR qr,rd,ra 496 0.001209075s
	
	
	==> describe nodes <==
	Name:               addons-540000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=addons-540000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=addons-540000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_08_31T15_06_07_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	                    topology.hostpath.csi/node=addons-540000
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:06:04 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  addons-540000
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:19:09 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:15:19 +0000   Sat, 31 Aug 2024 22:06:02 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:15:19 +0000   Sat, 31 Aug 2024 22:06:02 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:15:19 +0000   Sat, 31 Aug 2024 22:06:02 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:15:19 +0000   Sat, 31 Aug 2024 22:06:09 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.2
	  Hostname:    addons-540000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             3912944Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             3912944Ki
	  pods:               110
	System Info:
	  Machine ID:                 c27ec6afe1d44cb19063b9f8969fcf04
	  System UUID:                5cb945fd-0000-0000-9f1a-7b22b1f4e295
	  Boot ID:                    22370b64-4a2d-4d8f-ac2c-672de1bb0829
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (16 in total)
	  Namespace                   Name                                        CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                        ------------  ----------  ---------------  -------------  ---
	  default                     busybox                                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         9m16s
	  default                     cloud-spanner-emulator-769b77f747-xw9pt     0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  default                     nginx                                       0 (0%)        0 (0%)      0 (0%)           0 (0%)         5s
	  gcp-auth                    gcp-auth-89d5ffd79-9xlxx                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  ingress-nginx               ingress-nginx-controller-bc57996ff-78jrp    100m (5%)     0 (0%)      90Mi (2%)        0 (0%)         12m
	  kube-system                 coredns-6f6b679f8f-4l4hf                    100m (5%)     0 (0%)      70Mi (1%)        170Mi (4%)     13m
	  kube-system                 etcd-addons-540000                          100m (5%)     0 (0%)      100Mi (2%)       0 (0%)         13m
	  kube-system                 kube-apiserver-addons-540000                250m (12%)    0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 kube-controller-manager-addons-540000       200m (10%)    0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 kube-ingress-dns-minikube                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-proxy-nwvnv                            0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 kube-scheduler-addons-540000                100m (5%)     0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 nvidia-device-plugin-daemonset-q998b        0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 storage-provisioner                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  local-path-storage          local-path-provisioner-86d989889c-kwlpr     0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  yakd-dashboard              yakd-dashboard-67d98fc6b-f7zvd              0 (0%)        0 (0%)      128Mi (3%)       256Mi (6%)     12m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                850m (42%)   0 (0%)
	  memory             388Mi (10%)  426Mi (11%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 12m                kube-proxy       
	  Normal  NodeHasSufficientMemory  13m (x8 over 13m)  kubelet          Node addons-540000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    13m (x8 over 13m)  kubelet          Node addons-540000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     13m (x7 over 13m)  kubelet          Node addons-540000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  13m                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 13m                kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  13m                kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  13m                kubelet          Node addons-540000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    13m                kubelet          Node addons-540000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     13m                kubelet          Node addons-540000 status is now: NodeHasSufficientPID
	  Normal  NodeReady                13m                kubelet          Node addons-540000 status is now: NodeReady
	  Normal  RegisteredNode           13m                node-controller  Node addons-540000 event: Registered Node addons-540000 in Controller
	
	
	==> dmesg <==
	[  +5.268323] kauditd_printk_skb: 2 callbacks suppressed
	[ +10.298210] kauditd_printk_skb: 2 callbacks suppressed
	[Aug31 22:07] kauditd_printk_skb: 14 callbacks suppressed
	[  +5.318112] kauditd_printk_skb: 22 callbacks suppressed
	[  +6.128142] kauditd_printk_skb: 35 callbacks suppressed
	[  +5.004652] kauditd_printk_skb: 10 callbacks suppressed
	[  +5.303457] kauditd_printk_skb: 34 callbacks suppressed
	[  +5.075852] kauditd_printk_skb: 38 callbacks suppressed
	[ +16.098163] kauditd_printk_skb: 16 callbacks suppressed
	[  +7.379683] kauditd_printk_skb: 38 callbacks suppressed
	[Aug31 22:08] kauditd_printk_skb: 28 callbacks suppressed
	[Aug31 22:09] kauditd_printk_skb: 40 callbacks suppressed
	[  +9.835094] kauditd_printk_skb: 28 callbacks suppressed
	[ +22.051473] kauditd_printk_skb: 9 callbacks suppressed
	[  +6.186109] kauditd_printk_skb: 2 callbacks suppressed
	[ +18.016634] kauditd_printk_skb: 20 callbacks suppressed
	[Aug31 22:10] kauditd_printk_skb: 2 callbacks suppressed
	[Aug31 22:13] kauditd_printk_skb: 28 callbacks suppressed
	[Aug31 22:18] kauditd_printk_skb: 28 callbacks suppressed
	[ +11.931953] kauditd_printk_skb: 19 callbacks suppressed
	[ +11.159668] kauditd_printk_skb: 53 callbacks suppressed
	[  +7.956518] kauditd_printk_skb: 33 callbacks suppressed
	[  +5.767958] kauditd_printk_skb: 6 callbacks suppressed
	[  +9.439436] kauditd_printk_skb: 27 callbacks suppressed
	[Aug31 22:19] kauditd_printk_skb: 4 callbacks suppressed
	
	
	==> etcd [cda9242c7e84] <==
	{"level":"info","ts":"2024-08-31T22:06:45.127480Z","caller":"traceutil/trace.go:171","msg":"trace[103865281] range","detail":"{range_begin:/registry/pods; range_end:; response_count:0; response_revision:1044; }","duration":"181.005077ms","start":"2024-08-31T22:06:44.946465Z","end":"2024-08-31T22:06:45.127470Z","steps":["trace[103865281] 'range keys from in-memory index tree'  (duration: 180.859112ms)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:06:45.127555Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"149.706951ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/minions/addons-540000\" ","response":"range_response_count:1 size:4570"}
	{"level":"info","ts":"2024-08-31T22:06:45.127597Z","caller":"traceutil/trace.go:171","msg":"trace[1021330334] range","detail":"{range_begin:/registry/minions/addons-540000; range_end:; response_count:1; response_revision:1044; }","duration":"149.822249ms","start":"2024-08-31T22:06:44.977769Z","end":"2024-08-31T22:06:45.127592Z","steps":["trace[1021330334] 'range keys from in-memory index tree'  (duration: 149.651583ms)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:06:45.127746Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"167.957717ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods\" limit:1 ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-08-31T22:06:45.127947Z","caller":"traceutil/trace.go:171","msg":"trace[891626558] range","detail":"{range_begin:/registry/pods; range_end:; response_count:0; response_revision:1044; }","duration":"168.160903ms","start":"2024-08-31T22:06:44.959781Z","end":"2024-08-31T22:06:45.127942Z","steps":["trace[891626558] 'range keys from in-memory index tree'  (duration: 167.935148ms)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:06:45.127820Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"102.680571ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods\" limit:1 ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-08-31T22:06:45.128882Z","caller":"traceutil/trace.go:171","msg":"trace[14056595] range","detail":"{range_begin:/registry/pods; range_end:; response_count:0; response_revision:1044; }","duration":"103.742522ms","start":"2024-08-31T22:06:45.025133Z","end":"2024-08-31T22:06:45.128876Z","steps":["trace[14056595] 'range keys from in-memory index tree'  (duration: 102.623154ms)"],"step_count":1}
	{"level":"info","ts":"2024-08-31T22:06:50.692361Z","caller":"traceutil/trace.go:171","msg":"trace[1875624093] linearizableReadLoop","detail":"{readStateIndex:1080; appliedIndex:1079; }","duration":"133.978587ms","start":"2024-08-31T22:06:50.558374Z","end":"2024-08-31T22:06:50.692352Z","steps":["trace[1875624093] 'read index received'  (duration: 133.873852ms)","trace[1875624093] 'applied index is now lower than readState.Index'  (duration: 104.441µs)"],"step_count":2}
	{"level":"warn","ts":"2024-08-31T22:06:50.692426Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"134.043959ms","expected-duration":"100ms","prefix":"read-only range ","request":"limit:1 keys_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-08-31T22:06:50.692442Z","caller":"traceutil/trace.go:171","msg":"trace[1277465264] range","detail":"{range_begin:; range_end:; response_count:0; response_revision:1059; }","duration":"134.065628ms","start":"2024-08-31T22:06:50.558371Z","end":"2024-08-31T22:06:50.692437Z","steps":["trace[1277465264] 'agreement among raft nodes before linearized reading'  (duration: 134.037623ms)"],"step_count":1}
	{"level":"info","ts":"2024-08-31T22:06:50.692486Z","caller":"traceutil/trace.go:171","msg":"trace[2095054021] transaction","detail":"{read_only:false; response_revision:1059; number_of_response:1; }","duration":"134.720287ms","start":"2024-08-31T22:06:50.557759Z","end":"2024-08-31T22:06:50.692479Z","steps":["trace[2095054021] 'process raft request'  (duration: 134.51113ms)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:07:13.114319Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"134.19129ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods\" limit:1 ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-08-31T22:07:13.114879Z","caller":"traceutil/trace.go:171","msg":"trace[2035652486] range","detail":"{range_begin:/registry/pods; range_end:; response_count:0; response_revision:1167; }","duration":"134.99409ms","start":"2024-08-31T22:07:12.979875Z","end":"2024-08-31T22:07:13.114869Z","steps":["trace[2035652486] 'range keys from in-memory index tree'  (duration: 133.906602ms)"],"step_count":1}
	{"level":"info","ts":"2024-08-31T22:07:38.190822Z","caller":"traceutil/trace.go:171","msg":"trace[289724366] transaction","detail":"{read_only:false; response_revision:1261; number_of_response:1; }","duration":"126.098262ms","start":"2024-08-31T22:07:38.064681Z","end":"2024-08-31T22:07:38.190780Z","steps":["trace[289724366] 'process raft request'  (duration: 125.999548ms)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:09:15.481112Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"130.035757ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/health\" ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-08-31T22:09:15.481820Z","caller":"traceutil/trace.go:171","msg":"trace[874643604] range","detail":"{range_begin:/registry/health; range_end:; response_count:0; response_revision:1534; }","duration":"130.783015ms","start":"2024-08-31T22:09:15.351022Z","end":"2024-08-31T22:09:15.481805Z","steps":["trace[874643604] 'range keys from in-memory index tree'  (duration: 129.999594ms)"],"step_count":1}
	{"level":"info","ts":"2024-08-31T22:09:37.757609Z","caller":"traceutil/trace.go:171","msg":"trace[452485910] transaction","detail":"{read_only:false; response_revision:1599; number_of_response:1; }","duration":"151.853816ms","start":"2024-08-31T22:09:37.605745Z","end":"2024-08-31T22:09:37.757598Z","steps":["trace[452485910] 'process raft request'  (duration: 151.790232ms)"],"step_count":1}
	{"level":"info","ts":"2024-08-31T22:16:02.721823Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":1882}
	{"level":"info","ts":"2024-08-31T22:16:02.804159Z","caller":"mvcc/kvstore_compaction.go:69","msg":"finished scheduled compaction","compact-revision":1882,"took":"81.407274ms","hash":3471796663,"current-db-size-bytes":9150464,"current-db-size":"9.2 MB","current-db-size-in-use-bytes":4943872,"current-db-size-in-use":"4.9 MB"}
	{"level":"info","ts":"2024-08-31T22:16:02.804336Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":3471796663,"revision":1882,"compact-revision":-1}
	{"level":"info","ts":"2024-08-31T22:19:12.294151Z","caller":"traceutil/trace.go:171","msg":"trace[1636253564] transaction","detail":"{read_only:false; response_revision:2891; number_of_response:1; }","duration":"129.326214ms","start":"2024-08-31T22:19:12.164808Z","end":"2024-08-31T22:19:12.294134Z","steps":["trace[1636253564] 'process raft request'  (duration: 92.569066ms)","trace[1636253564] 'compare'  (duration: 36.683522ms)"],"step_count":2}
	{"level":"info","ts":"2024-08-31T22:19:12.294936Z","caller":"traceutil/trace.go:171","msg":"trace[1456048006] linearizableReadLoop","detail":"{readStateIndex:3087; appliedIndex:3085; }","duration":"127.734679ms","start":"2024-08-31T22:19:12.166839Z","end":"2024-08-31T22:19:12.294573Z","steps":["trace[1456048006] 'read index received'  (duration: 90.546022ms)","trace[1456048006] 'applied index is now lower than readState.Index'  (duration: 37.188002ms)"],"step_count":2}
	{"level":"warn","ts":"2024-08-31T22:19:12.295584Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"128.731465ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/kube-system/registry-proxy-j5x8q\" ","response":"range_response_count:1 size:4287"}
	{"level":"info","ts":"2024-08-31T22:19:12.295699Z","caller":"traceutil/trace.go:171","msg":"trace[1019197380] range","detail":"{range_begin:/registry/pods/kube-system/registry-proxy-j5x8q; range_end:; response_count:1; response_revision:2892; }","duration":"128.848716ms","start":"2024-08-31T22:19:12.166837Z","end":"2024-08-31T22:19:12.295685Z","steps":["trace[1019197380] 'agreement among raft nodes before linearized reading'  (duration: 128.31695ms)"],"step_count":1}
	{"level":"info","ts":"2024-08-31T22:19:12.295980Z","caller":"traceutil/trace.go:171","msg":"trace[601909591] transaction","detail":"{read_only:false; response_revision:2892; number_of_response:1; }","duration":"130.490705ms","start":"2024-08-31T22:19:12.165482Z","end":"2024-08-31T22:19:12.295973Z","steps":["trace[601909591] 'process raft request'  (duration: 128.987362ms)"],"step_count":1}
	
	
	==> gcp-auth [c338f6906e4e] <==
	2024/08/31 22:09:14 GCP Auth Webhook started!
	2024/08/31 22:09:30 Ready to marshal response ...
	2024/08/31 22:09:30 Ready to write response ...
	2024/08/31 22:09:31 Ready to marshal response ...
	2024/08/31 22:09:31 Ready to write response ...
	2024/08/31 22:09:56 Ready to marshal response ...
	2024/08/31 22:09:56 Ready to write response ...
	2024/08/31 22:09:56 Ready to marshal response ...
	2024/08/31 22:09:56 Ready to write response ...
	2024/08/31 22:09:56 Ready to marshal response ...
	2024/08/31 22:09:56 Ready to write response ...
	2024/08/31 22:18:10 Ready to marshal response ...
	2024/08/31 22:18:10 Ready to write response ...
	2024/08/31 22:18:14 Ready to marshal response ...
	2024/08/31 22:18:14 Ready to write response ...
	2024/08/31 22:18:25 Ready to marshal response ...
	2024/08/31 22:18:25 Ready to write response ...
	2024/08/31 22:18:46 Ready to marshal response ...
	2024/08/31 22:18:46 Ready to write response ...
	2024/08/31 22:19:07 Ready to marshal response ...
	2024/08/31 22:19:07 Ready to write response ...
	
	
	==> kernel <==
	 22:19:12 up 13 min,  0 users,  load average: 0.50, 0.47, 0.39
	Linux addons-540000 5.10.207 #1 SMP Wed Aug 28 20:54:17 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kube-apiserver [646be17f77ef] <==
	W0831 22:09:47.872718       1 cacher.go:171] Terminating all watchers from cacher numatopologies.nodeinfo.volcano.sh
	W0831 22:09:47.878863       1 cacher.go:171] Terminating all watchers from cacher podgroups.scheduling.volcano.sh
	W0831 22:09:47.980476       1 cacher.go:171] Terminating all watchers from cacher queues.scheduling.volcano.sh
	W0831 22:09:48.180674       1 cacher.go:171] Terminating all watchers from cacher jobs.batch.volcano.sh
	W0831 22:09:48.273833       1 cacher.go:171] Terminating all watchers from cacher jobflows.flow.volcano.sh
	W0831 22:09:48.510445       1 cacher.go:171] Terminating all watchers from cacher jobtemplates.flow.volcano.sh
	I0831 22:18:20.954361       1 controller.go:615] quota admission added evaluator for: volumesnapshots.snapshot.storage.k8s.io
	I0831 22:18:40.968263       1 handler.go:286] Adding GroupVersion snapshot.storage.k8s.io v1 to ResourceManager
	I0831 22:18:40.968483       1 handler.go:286] Adding GroupVersion snapshot.storage.k8s.io v1beta1 to ResourceManager
	I0831 22:18:40.988330       1 handler.go:286] Adding GroupVersion snapshot.storage.k8s.io v1 to ResourceManager
	I0831 22:18:40.988378       1 handler.go:286] Adding GroupVersion snapshot.storage.k8s.io v1beta1 to ResourceManager
	I0831 22:18:40.993969       1 handler.go:286] Adding GroupVersion snapshot.storage.k8s.io v1 to ResourceManager
	I0831 22:18:40.994012       1 handler.go:286] Adding GroupVersion snapshot.storage.k8s.io v1beta1 to ResourceManager
	I0831 22:18:41.009412       1 handler.go:286] Adding GroupVersion snapshot.storage.k8s.io v1 to ResourceManager
	I0831 22:18:41.009656       1 handler.go:286] Adding GroupVersion snapshot.storage.k8s.io v1beta1 to ResourceManager
	I0831 22:18:41.021810       1 handler.go:286] Adding GroupVersion snapshot.storage.k8s.io v1 to ResourceManager
	I0831 22:18:41.021850       1 handler.go:286] Adding GroupVersion snapshot.storage.k8s.io v1beta1 to ResourceManager
	W0831 22:18:41.994566       1 cacher.go:171] Terminating all watchers from cacher volumesnapshotclasses.snapshot.storage.k8s.io
	W0831 22:18:42.024044       1 cacher.go:171] Terminating all watchers from cacher volumesnapshotcontents.snapshot.storage.k8s.io
	W0831 22:18:42.140155       1 cacher.go:171] Terminating all watchers from cacher volumesnapshots.snapshot.storage.k8s.io
	I0831 22:18:58.118358       1 controller.go:129] OpenAPI AggregationController: action for item v1beta1.metrics.k8s.io: Nothing (removed from the queue).
	I0831 22:19:01.753628       1 handler.go:286] Adding GroupVersion gadget.kinvolk.io v1alpha1 to ResourceManager
	W0831 22:19:02.773227       1 cacher.go:171] Terminating all watchers from cacher traces.gadget.kinvolk.io
	I0831 22:19:07.246117       1 controller.go:615] quota admission added evaluator for: ingresses.networking.k8s.io
	I0831 22:19:07.374257       1 alloc.go:330] "allocated clusterIPs" service="default/nginx" clusterIPs={"IPv4":"10.102.100.112"}
	
	
	==> kube-controller-manager [5bf334a8d125] <==
	W0831 22:19:01.721662       1 reflector.go:561] k8s.io/client-go/metadata/metadatainformer/informer.go:138: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0831 22:19:01.721705       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/metadata/metadatainformer/informer.go:138: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError"
	E0831 22:19:02.775005       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/metadata/metadatainformer/informer.go:138: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError"
	W0831 22:19:02.986092       1 reflector.go:561] k8s.io/client-go/metadata/metadatainformer/informer.go:138: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0831 22:19:02.986174       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/metadata/metadatainformer/informer.go:138: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError"
	W0831 22:19:03.732054       1 reflector.go:561] k8s.io/client-go/metadata/metadatainformer/informer.go:138: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0831 22:19:03.732095       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/metadata/metadatainformer/informer.go:138: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError"
	W0831 22:19:05.534856       1 reflector.go:561] k8s.io/client-go/metadata/metadatainformer/informer.go:138: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0831 22:19:05.535077       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/metadata/metadatainformer/informer.go:138: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError"
	W0831 22:19:05.637214       1 reflector.go:561] k8s.io/client-go/metadata/metadatainformer/informer.go:138: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0831 22:19:05.637261       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/metadata/metadatainformer/informer.go:138: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError"
	W0831 22:19:06.401629       1 reflector.go:561] k8s.io/client-go/metadata/metadatainformer/informer.go:138: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0831 22:19:06.401713       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/metadata/metadatainformer/informer.go:138: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError"
	W0831 22:19:08.108535       1 reflector.go:561] k8s.io/client-go/metadata/metadatainformer/informer.go:138: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0831 22:19:08.108759       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/metadata/metadatainformer/informer.go:138: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError"
	W0831 22:19:08.237724       1 reflector.go:561] k8s.io/client-go/metadata/metadatainformer/informer.go:138: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0831 22:19:08.237966       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/metadata/metadatainformer/informer.go:138: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError"
	I0831 22:19:10.211538       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/registry-6fb4cdfc84" duration="3.77µs"
	I0831 22:19:11.364175       1 shared_informer.go:313] Waiting for caches to sync for resource quota
	I0831 22:19:11.364194       1 shared_informer.go:320] Caches are synced for resource quota
	W0831 22:19:11.575844       1 reflector.go:561] k8s.io/client-go/metadata/metadatainformer/informer.go:138: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0831 22:19:11.575871       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/metadata/metadatainformer/informer.go:138: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource" logger="UnhandledError"
	I0831 22:19:11.625536       1 shared_informer.go:313] Waiting for caches to sync for garbage collector
	I0831 22:19:11.625580       1 shared_informer.go:320] Caches are synced for garbage collector
	I0831 22:19:11.827997       1 namespace_controller.go:187] "Namespace has been deleted" logger="namespace-controller" namespace="gadget"
	
	
	==> kube-proxy [4e9b4c61d08f] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:06:14.427816       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:06:14.435325       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.2"]
	E0831 22:06:14.435382       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:06:14.502889       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:06:14.502953       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:06:14.503002       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:06:14.508022       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:06:14.508241       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:06:14.508250       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:06:14.509543       1 config.go:197] "Starting service config controller"
	I0831 22:06:14.509562       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:06:14.533400       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:06:14.537007       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:06:14.542813       1 config.go:326] "Starting node config controller"
	I0831 22:06:14.542823       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:06:14.609685       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:06:14.639970       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0831 22:06:14.643317       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [90f4af9d6e89] <==
	W0831 22:06:04.063756       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0831 22:06:04.063915       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:06:04.064048       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0831 22:06:04.064173       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:06:04.064319       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0831 22:06:04.064358       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:06:04.064430       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0831 22:06:04.064465       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:06:04.064567       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0831 22:06:04.064734       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:06:04.931001       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0831 22:06:04.931299       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:06:04.945564       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0831 22:06:04.945616       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:06:05.027151       1 reflector.go:561] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0831 22:06:05.027204       1 reflector.go:158] "Unhandled Error" err="runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError"
	W0831 22:06:05.063122       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0831 22:06:05.063166       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csinodes\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:06:05.089571       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0831 22:06:05.089624       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:06:05.098354       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0831 22:06:05.098565       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:06:05.213387       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0831 22:06:05.213433       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError"
	I0831 22:06:07.456842       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Aug 31 22:19:07 addons-540000 kubelet[2004]: I0831 22:19:07.447546    2004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbw9l\" (UniqueName: \"kubernetes.io/projected/25365911-81eb-47e5-a5bd-f6a1878e03b6-kube-api-access-xbw9l\") pod \"nginx\" (UID: \"25365911-81eb-47e5-a5bd-f6a1878e03b6\") " pod="default/nginx"
	Aug 31 22:19:07 addons-540000 kubelet[2004]: I0831 22:19:07.447641    2004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"gcp-creds\" (UniqueName: \"kubernetes.io/host-path/25365911-81eb-47e5-a5bd-f6a1878e03b6-gcp-creds\") pod \"nginx\" (UID: \"25365911-81eb-47e5-a5bd-f6a1878e03b6\") " pod="default/nginx"
	Aug 31 22:19:07 addons-540000 kubelet[2004]: E0831 22:19:07.496175    2004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"busybox\" with ImagePullBackOff: \"Back-off pulling image \\\"gcr.io/k8s-minikube/busybox:1.28.4-glibc\\\"\"" pod="default/busybox" podUID="1dd6ae83-dce2-417f-a520-906f838905cb"
	Aug 31 22:19:07 addons-540000 kubelet[2004]: I0831 22:19:07.796028    2004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3801a844df0abca96f0f596add0661537ef170c1061e7193812d5448d65184b6"
	Aug 31 22:19:10 addons-540000 kubelet[2004]: I0831 22:19:10.065644    2004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"gcp-creds\" (UniqueName: \"kubernetes.io/host-path/d4abf269-7c89-4e5d-9898-affb7154dfc1-gcp-creds\") pod \"d4abf269-7c89-4e5d-9898-affb7154dfc1\" (UID: \"d4abf269-7c89-4e5d-9898-affb7154dfc1\") "
	Aug 31 22:19:10 addons-540000 kubelet[2004]: I0831 22:19:10.065689    2004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqmx2\" (UniqueName: \"kubernetes.io/projected/d4abf269-7c89-4e5d-9898-affb7154dfc1-kube-api-access-rqmx2\") pod \"d4abf269-7c89-4e5d-9898-affb7154dfc1\" (UID: \"d4abf269-7c89-4e5d-9898-affb7154dfc1\") "
	Aug 31 22:19:10 addons-540000 kubelet[2004]: I0831 22:19:10.065760    2004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4abf269-7c89-4e5d-9898-affb7154dfc1-gcp-creds" (OuterVolumeSpecName: "gcp-creds") pod "d4abf269-7c89-4e5d-9898-affb7154dfc1" (UID: "d4abf269-7c89-4e5d-9898-affb7154dfc1"). InnerVolumeSpecName "gcp-creds". PluginName "kubernetes.io/host-path", VolumeGidValue ""
	Aug 31 22:19:10 addons-540000 kubelet[2004]: I0831 22:19:10.071262    2004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4abf269-7c89-4e5d-9898-affb7154dfc1-kube-api-access-rqmx2" (OuterVolumeSpecName: "kube-api-access-rqmx2") pod "d4abf269-7c89-4e5d-9898-affb7154dfc1" (UID: "d4abf269-7c89-4e5d-9898-affb7154dfc1"). InnerVolumeSpecName "kube-api-access-rqmx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
	Aug 31 22:19:10 addons-540000 kubelet[2004]: I0831 22:19:10.167011    2004 reconciler_common.go:288] "Volume detached for volume \"gcp-creds\" (UniqueName: \"kubernetes.io/host-path/d4abf269-7c89-4e5d-9898-affb7154dfc1-gcp-creds\") on node \"addons-540000\" DevicePath \"\""
	Aug 31 22:19:10 addons-540000 kubelet[2004]: I0831 22:19:10.167083    2004 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-rqmx2\" (UniqueName: \"kubernetes.io/projected/d4abf269-7c89-4e5d-9898-affb7154dfc1-kube-api-access-rqmx2\") on node \"addons-540000\" DevicePath \"\""
	Aug 31 22:19:10 addons-540000 kubelet[2004]: I0831 22:19:10.971732    2004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f485\" (UniqueName: \"kubernetes.io/projected/d2171259-b754-493c-a539-6115a91bf784-kube-api-access-8f485\") pod \"d2171259-b754-493c-a539-6115a91bf784\" (UID: \"d2171259-b754-493c-a539-6115a91bf784\") "
	Aug 31 22:19:10 addons-540000 kubelet[2004]: I0831 22:19:10.984390    2004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2171259-b754-493c-a539-6115a91bf784-kube-api-access-8f485" (OuterVolumeSpecName: "kube-api-access-8f485") pod "d2171259-b754-493c-a539-6115a91bf784" (UID: "d2171259-b754-493c-a539-6115a91bf784"). InnerVolumeSpecName "kube-api-access-8f485". PluginName "kubernetes.io/projected", VolumeGidValue ""
	Aug 31 22:19:11 addons-540000 kubelet[2004]: I0831 22:19:11.073053    2004 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-8f485\" (UniqueName: \"kubernetes.io/projected/d2171259-b754-493c-a539-6115a91bf784-kube-api-access-8f485\") on node \"addons-540000\" DevicePath \"\""
	Aug 31 22:19:11 addons-540000 kubelet[2004]: I0831 22:19:11.120546    2004 scope.go:117] "RemoveContainer" containerID="110c503d0b78f4f3b440fad4f1fb007492232a887c7eccd0b54a241715fee42a"
	Aug 31 22:19:11 addons-540000 kubelet[2004]: I0831 22:19:11.165446    2004 scope.go:117] "RemoveContainer" containerID="1de2ebd6c1edda4f600bc8b1627576fe70782f5ce6a740c710dbe9c76ef73b17"
	Aug 31 22:19:11 addons-540000 kubelet[2004]: I0831 22:19:11.202293    2004 scope.go:117] "RemoveContainer" containerID="1de2ebd6c1edda4f600bc8b1627576fe70782f5ce6a740c710dbe9c76ef73b17"
	Aug 31 22:19:11 addons-540000 kubelet[2004]: E0831 22:19:11.207224    2004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = Unknown desc = Error response from daemon: No such container: 1de2ebd6c1edda4f600bc8b1627576fe70782f5ce6a740c710dbe9c76ef73b17" containerID="1de2ebd6c1edda4f600bc8b1627576fe70782f5ce6a740c710dbe9c76ef73b17"
	Aug 31 22:19:11 addons-540000 kubelet[2004]: I0831 22:19:11.207249    2004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"docker","ID":"1de2ebd6c1edda4f600bc8b1627576fe70782f5ce6a740c710dbe9c76ef73b17"} err="failed to get container status \"1de2ebd6c1edda4f600bc8b1627576fe70782f5ce6a740c710dbe9c76ef73b17\": rpc error: code = Unknown desc = Error response from daemon: No such container: 1de2ebd6c1edda4f600bc8b1627576fe70782f5ce6a740c710dbe9c76ef73b17"
	Aug 31 22:19:11 addons-540000 kubelet[2004]: I0831 22:19:11.274819    2004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grm74\" (UniqueName: \"kubernetes.io/projected/97be67b8-2585-454b-bdbb-6a388e9592e6-kube-api-access-grm74\") pod \"97be67b8-2585-454b-bdbb-6a388e9592e6\" (UID: \"97be67b8-2585-454b-bdbb-6a388e9592e6\") "
	Aug 31 22:19:11 addons-540000 kubelet[2004]: I0831 22:19:11.277320    2004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97be67b8-2585-454b-bdbb-6a388e9592e6-kube-api-access-grm74" (OuterVolumeSpecName: "kube-api-access-grm74") pod "97be67b8-2585-454b-bdbb-6a388e9592e6" (UID: "97be67b8-2585-454b-bdbb-6a388e9592e6"). InnerVolumeSpecName "kube-api-access-grm74". PluginName "kubernetes.io/projected", VolumeGidValue ""
	Aug 31 22:19:11 addons-540000 kubelet[2004]: I0831 22:19:11.376034    2004 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-grm74\" (UniqueName: \"kubernetes.io/projected/97be67b8-2585-454b-bdbb-6a388e9592e6-kube-api-access-grm74\") on node \"addons-540000\" DevicePath \"\""
	Aug 31 22:19:12 addons-540000 kubelet[2004]: I0831 22:19:12.339764    2004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nginx" podStartSLOduration=2.038554343 podStartE2EDuration="5.339749762s" podCreationTimestamp="2024-08-31 22:19:07 +0000 UTC" firstStartedPulling="2024-08-31 22:19:07.85114 +0000 UTC m=+781.526984023" lastFinishedPulling="2024-08-31 22:19:11.152335418 +0000 UTC m=+784.828179442" observedRunningTime="2024-08-31 22:19:12.166058763 +0000 UTC m=+785.841902791" watchObservedRunningTime="2024-08-31 22:19:12.339749762 +0000 UTC m=+786.015593791"
	Aug 31 22:19:12 addons-540000 kubelet[2004]: I0831 22:19:12.501319    2004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97be67b8-2585-454b-bdbb-6a388e9592e6" path="/var/lib/kubelet/pods/97be67b8-2585-454b-bdbb-6a388e9592e6/volumes"
	Aug 31 22:19:12 addons-540000 kubelet[2004]: I0831 22:19:12.501632    2004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2171259-b754-493c-a539-6115a91bf784" path="/var/lib/kubelet/pods/d2171259-b754-493c-a539-6115a91bf784/volumes"
	Aug 31 22:19:12 addons-540000 kubelet[2004]: I0831 22:19:12.501915    2004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4abf269-7c89-4e5d-9898-affb7154dfc1" path="/var/lib/kubelet/pods/d4abf269-7c89-4e5d-9898-affb7154dfc1/volumes"
	
	
	==> storage-provisioner [3096c60c5afa] <==
	I0831 22:06:17.818426       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0831 22:06:17.827568       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0831 22:06:17.827629       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0831 22:06:17.837856       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0831 22:06:17.838062       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_addons-540000_e466a124-0b70-4283-a663-8ade994343bc!
	I0831 22:06:17.842004       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"8194773a-4210-4d80-bf29-cb55627dab8a", APIVersion:"v1", ResourceVersion:"534", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' addons-540000_e466a124-0b70-4283-a663-8ade994343bc became leader
	I0831 22:06:17.939146       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_addons-540000_e466a124-0b70-4283-a663-8ade994343bc!
	

-- /stdout --
helpers_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p addons-540000 -n addons-540000
helpers_test.go:262: (dbg) Run:  kubectl --context addons-540000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:273: non-running pods: busybox ingress-nginx-admission-create-c9ms7 ingress-nginx-admission-patch-fctdf
helpers_test.go:275: ======> post-mortem[TestAddons/parallel/Registry]: describe non-running pods <======
helpers_test.go:278: (dbg) Run:  kubectl --context addons-540000 describe pod busybox ingress-nginx-admission-create-c9ms7 ingress-nginx-admission-patch-fctdf
helpers_test.go:278: (dbg) Non-zero exit: kubectl --context addons-540000 describe pod busybox ingress-nginx-admission-create-c9ms7 ingress-nginx-admission-patch-fctdf: exit status 1 (56.078684ms)

-- stdout --
	Name:             busybox
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             addons-540000/192.169.0.2
	Start Time:       Sat, 31 Aug 2024 15:09:56 -0700
	Labels:           integration-test=busybox
	Annotations:      <none>
	Status:           Pending
	IP:               10.244.0.28
	IPs:
	  IP:  10.244.0.28
	Containers:
	  busybox:
	    Container ID:  
	    Image:         gcr.io/k8s-minikube/busybox:1.28.4-glibc
	    Image ID:      
	    Port:          <none>
	    Host Port:     <none>
	    Command:
	      sleep
	      3600
	    State:          Waiting
	      Reason:       ImagePullBackOff
	    Ready:          False
	    Restart Count:  0
	    Environment:
	      GOOGLE_APPLICATION_CREDENTIALS:  /google-app-creds.json
	      PROJECT_ID:                      this_is_fake
	      GCP_PROJECT:                     this_is_fake
	      GCLOUD_PROJECT:                  this_is_fake
	      GOOGLE_CLOUD_PROJECT:            this_is_fake
	      CLOUDSDK_CORE_PROJECT:           this_is_fake
	    Mounts:
	      /google-app-creds.json from gcp-creds (ro)
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-wxtxh (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   True 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  kube-api-access-wxtxh:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	  gcp-creds:
	    Type:          HostPath (bare host directory volume)
	    Path:          /var/lib/minikube/google_application_credentials.json
	    HostPathType:  File
	QoS Class:         BestEffort
	Node-Selectors:    <none>
	Tolerations:       node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                   node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason     Age                    From               Message
	  ----     ------     ----                   ----               -------
	  Normal   Scheduled  9m17s                  default-scheduler  Successfully assigned default/busybox to addons-540000
	  Normal   Pulling    7m54s (x4 over 9m16s)  kubelet            Pulling image "gcr.io/k8s-minikube/busybox:1.28.4-glibc"
	  Warning  Failed     7m54s (x4 over 9m16s)  kubelet            Failed to pull image "gcr.io/k8s-minikube/busybox:1.28.4-glibc": Error response from daemon: Head "https://gcr.io/v2/k8s-minikube/busybox/manifests/1.28.4-glibc": unauthorized: authentication failed
	  Warning  Failed     7m54s (x4 over 9m16s)  kubelet            Error: ErrImagePull
	  Warning  Failed     7m31s (x6 over 9m16s)  kubelet            Error: ImagePullBackOff
	  Normal   BackOff    4m6s (x21 over 9m16s)  kubelet            Back-off pulling image "gcr.io/k8s-minikube/busybox:1.28.4-glibc"

-- /stdout --
** stderr ** 
	Error from server (NotFound): pods "ingress-nginx-admission-create-c9ms7" not found
	Error from server (NotFound): pods "ingress-nginx-admission-patch-fctdf" not found

** /stderr **
helpers_test.go:280: kubectl --context addons-540000 describe pod busybox ingress-nginx-admission-create-c9ms7 ingress-nginx-admission-patch-fctdf: exit status 1
--- FAIL: TestAddons/parallel/Registry (74.03s)

TestCertOptions (251.74s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-options-884000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit 
E0831 16:24:08.580857    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:24:15.464267    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:24:36.295975    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
cert_options_test.go:49: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p cert-options-884000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit : exit status 80 (4m6.057263933s)

-- stdout --
	* [cert-options-884000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "cert-options-884000" primary control-plane node in "cert-options-884000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "cert-options-884000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 9a:61:f2:20:3d:8e
	* Failed to start hyperkit VM. Running "minikube delete -p cert-options-884000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 4a:f5:6e:a2:c:33
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 4a:f5:6e:a2:c:33
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
cert_options_test.go:51: failed to start minikube with args: "out/minikube-darwin-amd64 start -p cert-options-884000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit " : exit status 80
cert_options_test.go:60: (dbg) Run:  out/minikube-darwin-amd64 -p cert-options-884000 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:60: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p cert-options-884000 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt": exit status 50 (162.989289ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node cert-options-884000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

** /stderr **
cert_options_test.go:62: failed to read apiserver cert inside minikube. args "out/minikube-darwin-amd64 -p cert-options-884000 ssh \"openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt\"": exit status 50
cert_options_test.go:69: apiserver cert does not include 127.0.0.1 in SAN.
cert_options_test.go:69: apiserver cert does not include 192.168.15.15 in SAN.
cert_options_test.go:69: apiserver cert does not include localhost in SAN.
cert_options_test.go:69: apiserver cert does not include www.google.com in SAN.
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-884000 config view
cert_options_test.go:93: Kubeconfig apiserver server port incorrect. Output of 
'kubectl config view' = "\n-- stdout --\n\tapiVersion: v1\n\tclusters: null\n\tcontexts: null\n\tcurrent-context: \"\"\n\tkind: Config\n\tpreferences: {}\n\tusers: null\n\n-- /stdout --"
cert_options_test.go:100: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cert-options-884000 -- "sudo cat /etc/kubernetes/admin.conf"
cert_options_test.go:100: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p cert-options-884000 -- "sudo cat /etc/kubernetes/admin.conf": exit status 50 (162.227702ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node cert-options-884000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

** /stderr **
cert_options_test.go:102: failed to SSH to minikube with args: "out/minikube-darwin-amd64 ssh -p cert-options-884000 -- \"sudo cat /etc/kubernetes/admin.conf\"" : exit status 50
cert_options_test.go:106: Internal minikube kubeconfig (admin.conf) does not contains the right api port. 
-- stdout --
	
	

-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node cert-options-884000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

** /stderr **
cert_options_test.go:109: *** TestCertOptions FAILED at 2024-08-31 16:27:01.817104 -0700 PDT m=+4919.010657486
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p cert-options-884000 -n cert-options-884000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p cert-options-884000 -n cert-options-884000: exit status 7 (77.988271ms)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0831 16:27:01.893474    6352 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0831 16:27:01.893497    6352 status.go:249] status error: getting IP: IP address is not set

** /stderr **
helpers_test.go:240: status error: exit status 7 (may be ok)
helpers_test.go:242: "cert-options-884000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:176: Cleaning up "cert-options-884000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-options-884000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-options-884000: (5.236753669s)
--- FAIL: TestCertOptions (251.74s)

TestCertExpiration (1744.3s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-144000 --memory=2048 --cert-expiration=3m --driver=hyperkit 
E0831 16:21:52.452479    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:22:35.820080    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
cert_options_test.go:123: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p cert-expiration-144000 --memory=2048 --cert-expiration=3m --driver=hyperkit : exit status 80 (4m6.423283193s)

-- stdout --
	* [cert-expiration-144000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "cert-expiration-144000" primary control-plane node in "cert-expiration-144000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "cert-expiration-144000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 2e:d:a1:f9:19:88
	* Failed to start hyperkit VM. Running "minikube delete -p cert-expiration-144000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 92:b:53:4:fe:c9
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 92:b:53:4:fe:c9
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
cert_options_test.go:125: failed to start minikube with args: "out/minikube-darwin-amd64 start -p cert-expiration-144000 --memory=2048 --cert-expiration=3m --driver=hyperkit " : exit status 80
cert_options_test.go:131: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-144000 --memory=2048 --cert-expiration=8760h --driver=hyperkit 
cert_options_test.go:131: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p cert-expiration-144000 --memory=2048 --cert-expiration=8760h --driver=hyperkit : exit status 80 (21m52.55005043s)

-- stdout --
	* [cert-expiration-144000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting "cert-expiration-144000" primary control-plane node in "cert-expiration-144000" cluster
	* Updating the running hyperkit "cert-expiration-144000" VM ...
	* Updating the running hyperkit "cert-expiration-144000" VM ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	* Failed to start hyperkit VM. Running "minikube delete -p cert-expiration-144000" may fix it: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
cert_options_test.go:133: failed to start minikube after cert expiration: "out/minikube-darwin-amd64 start -p cert-expiration-144000 --memory=2048 --cert-expiration=8760h --driver=hyperkit " : exit status 80
cert_options_test.go:136: minikube start output did not warn about expired certs: 
-- stdout --
	* [cert-expiration-144000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting "cert-expiration-144000" primary control-plane node in "cert-expiration-144000" cluster
	* Updating the running hyperkit "cert-expiration-144000" VM ...
	* Updating the running hyperkit "cert-expiration-144000" VM ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	* Failed to start hyperkit VM. Running "minikube delete -p cert-expiration-144000" may fix it: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
cert_options_test.go:138: *** TestCertExpiration FAILED at 2024-08-31 16:50:51.137892 -0700 PDT m=+6348.247860420
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p cert-expiration-144000 -n cert-expiration-144000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p cert-expiration-144000 -n cert-expiration-144000: exit status 7 (87.676803ms)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0831 16:50:51.223444    8043 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0831 16:50:51.223468    8043 status.go:249] status error: getting IP: IP address is not set

** /stderr **
helpers_test.go:240: status error: exit status 7 (may be ok)
helpers_test.go:242: "cert-expiration-144000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:176: Cleaning up "cert-expiration-144000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-expiration-144000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-expiration-144000: (5.241291005s)
--- FAIL: TestCertExpiration (1744.30s)

TestDockerFlags (252.12s)

=== RUN   TestDockerFlags
=== PAUSE TestDockerFlags

=== CONT  TestDockerFlags
docker_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 start -p docker-flags-031000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit 
E0831 16:19:08.578302    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:08.585481    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:08.597264    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:08.620648    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:08.662795    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:08.746201    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:08.909473    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:09.232752    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:09.874890    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:11.157445    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:13.718989    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:15.463290    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:18.840662    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:29.082655    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:19:49.565208    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:20:30.528648    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
docker_test.go:51: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p docker-flags-031000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit : exit status 80 (4m6.355222871s)

-- stdout --
	* [docker-flags-031000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "docker-flags-031000" primary control-plane node in "docker-flags-031000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "docker-flags-031000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	I0831 16:18:43.325391    6185 out.go:345] Setting OutFile to fd 1 ...
	I0831 16:18:43.325658    6185 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:18:43.325664    6185 out.go:358] Setting ErrFile to fd 2...
	I0831 16:18:43.325667    6185 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:18:43.325884    6185 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 16:18:43.327334    6185 out.go:352] Setting JSON to false
	I0831 16:18:43.349922    6185 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":4694,"bootTime":1725141629,"procs":441,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 16:18:43.350027    6185 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 16:18:43.373101    6185 out.go:177] * [docker-flags-031000] minikube v1.33.1 on Darwin 14.6.1
	I0831 16:18:43.419999    6185 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 16:18:43.420008    6185 notify.go:220] Checking for updates...
	I0831 16:18:43.462627    6185 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:18:43.483797    6185 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 16:18:43.504595    6185 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 16:18:43.525433    6185 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:18:43.545666    6185 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 16:18:43.567062    6185 config.go:182] Loaded profile config "force-systemd-flag-286000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:18:43.567155    6185 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 16:18:43.595579    6185 out.go:177] * Using the hyperkit driver based on user configuration
	I0831 16:18:43.635717    6185 start.go:297] selected driver: hyperkit
	I0831 16:18:43.635734    6185 start.go:901] validating driver "hyperkit" against <nil>
	I0831 16:18:43.635746    6185 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 16:18:43.638818    6185 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:18:43.638932    6185 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 16:18:43.647465    6185 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 16:18:43.651392    6185 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:18:43.651414    6185 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 16:18:43.651448    6185 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 16:18:43.651651    6185 start_flags.go:942] Waiting for no components: map[apiserver:false apps_running:false default_sa:false extra:false kubelet:false node_ready:false system_pods:false]
	I0831 16:18:43.651683    6185 cni.go:84] Creating CNI manager for ""
	I0831 16:18:43.651700    6185 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0831 16:18:43.651705    6185 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0831 16:18:43.651768    6185 start.go:340] cluster config:
	{Name:docker-flags-031000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[FOO=BAR BAZ=BAT] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[debug icc=true] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:docker-flags-031000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:
[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:false EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:false apps_running:false default_sa:false extra:false kubelet:false node_ready:false system_pods:false] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientP
ath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:18:43.651861    6185 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:18:43.693744    6185 out.go:177] * Starting "docker-flags-031000" primary control-plane node in "docker-flags-031000" cluster
	I0831 16:18:43.714737    6185 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 16:18:43.714776    6185 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 16:18:43.714793    6185 cache.go:56] Caching tarball of preloaded images
	I0831 16:18:43.714910    6185 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 16:18:43.714921    6185 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 16:18:43.715000    6185 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/docker-flags-031000/config.json ...
	I0831 16:18:43.715020    6185 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/docker-flags-031000/config.json: {Name:mk925c1521d3ece3fd8b51d53d83109806c5c953 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 16:18:43.715379    6185 start.go:360] acquireMachinesLock for docker-flags-031000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:19:40.638734    6185 start.go:364] duration metric: took 56.922966978s to acquireMachinesLock for "docker-flags-031000"
	I0831 16:19:40.638782    6185 start.go:93] Provisioning new machine with config: &{Name:docker-flags-031000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[FOO=BAR BAZ=BAT] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[debug icc=true] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSH
Key: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:docker-flags-031000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:false EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:false apps_running:false default_sa:false extra:false kubelet:false node_ready:false system_pods:false] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountI
P: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 16:19:40.638842    6185 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 16:19:40.660344    6185 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0831 16:19:40.660476    6185 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:19:40.660510    6185 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:19:40.669547    6185 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53718
	I0831 16:19:40.669988    6185 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:19:40.670625    6185 main.go:141] libmachine: Using API Version  1
	I0831 16:19:40.670634    6185 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:19:40.671017    6185 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:19:40.671196    6185 main.go:141] libmachine: (docker-flags-031000) Calling .GetMachineName
	I0831 16:19:40.671293    6185 main.go:141] libmachine: (docker-flags-031000) Calling .DriverName
	I0831 16:19:40.671418    6185 start.go:159] libmachine.API.Create for "docker-flags-031000" (driver="hyperkit")
	I0831 16:19:40.671441    6185 client.go:168] LocalClient.Create starting
	I0831 16:19:40.671473    6185 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 16:19:40.671521    6185 main.go:141] libmachine: Decoding PEM data...
	I0831 16:19:40.671537    6185 main.go:141] libmachine: Parsing certificate...
	I0831 16:19:40.671593    6185 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 16:19:40.671629    6185 main.go:141] libmachine: Decoding PEM data...
	I0831 16:19:40.671640    6185 main.go:141] libmachine: Parsing certificate...
	I0831 16:19:40.671652    6185 main.go:141] libmachine: Running pre-create checks...
	I0831 16:19:40.671659    6185 main.go:141] libmachine: (docker-flags-031000) Calling .PreCreateCheck
	I0831 16:19:40.671728    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:40.671875    6185 main.go:141] libmachine: (docker-flags-031000) Calling .GetConfigRaw
	I0831 16:19:40.681350    6185 main.go:141] libmachine: Creating machine...
	I0831 16:19:40.681376    6185 main.go:141] libmachine: (docker-flags-031000) Calling .Create
	I0831 16:19:40.681478    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:40.681597    6185 main.go:141] libmachine: (docker-flags-031000) DBG | I0831 16:19:40.681473    6210 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:19:40.681642    6185 main.go:141] libmachine: (docker-flags-031000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 16:19:40.907655    6185 main.go:141] libmachine: (docker-flags-031000) DBG | I0831 16:19:40.907558    6210 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/id_rsa...
	I0831 16:19:41.026045    6185 main.go:141] libmachine: (docker-flags-031000) DBG | I0831 16:19:41.025967    6210 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/docker-flags-031000.rawdisk...
	I0831 16:19:41.026059    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Writing magic tar header
	I0831 16:19:41.026068    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Writing SSH key tar header
	I0831 16:19:41.026634    6185 main.go:141] libmachine: (docker-flags-031000) DBG | I0831 16:19:41.026588    6210 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000 ...
	I0831 16:19:41.393217    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:41.393239    6185 main.go:141] libmachine: (docker-flags-031000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/hyperkit.pid
	I0831 16:19:41.393280    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Using UUID a00b8fd6-876f-459a-b994-dcfefdecf460
	I0831 16:19:41.418506    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Generated MAC 4e:6d:b9:70:e1:40
	I0831 16:19:41.418529    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-031000
	I0831 16:19:41.418584    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"a00b8fd6-876f-459a-b994-dcfefdecf460", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process
:(*os.Process)(nil)}
	I0831 16:19:41.418622    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"a00b8fd6-876f-459a-b994-dcfefdecf460", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process
:(*os.Process)(nil)}
	I0831 16:19:41.418667    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "a00b8fd6-876f-459a-b994-dcfefdecf460", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/docker-flags-031000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/bzimage,/Users/jenkins/minikub
e-integration/18943-957/.minikube/machines/docker-flags-031000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-031000"}
	I0831 16:19:41.418711    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U a00b8fd6-876f-459a-b994-dcfefdecf460 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/docker-flags-031000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000
/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-031000"
	I0831 16:19:41.418723    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:19:41.421654    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 DEBUG: hyperkit: Pid is 6211
	I0831 16:19:41.422109    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 0
	I0831 16:19:41.422142    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:41.422237    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:19:41.423329    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:19:41.423407    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:41.423421    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:41.423451    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:41.423472    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:41.423506    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:41.423526    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:41.423540    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:41.423551    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:41.423560    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:41.423569    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:41.423584    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:41.423599    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:41.423608    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:41.423617    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:41.423626    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:41.423635    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:41.423647    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:41.423659    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:41.429217    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:19:41.437643    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:19:41.438318    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:19:41.438340    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:19:41.438349    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:19:41.438361    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:19:41.810783    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:19:41.810804    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:19:41.925355    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:19:41.925376    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:19:41.925388    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:19:41.925397    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:19:41.926257    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:19:41.926276    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:41 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:19:43.424187    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 1
	I0831 16:19:43.424206    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:43.424297    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:19:43.425092    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:19:43.425151    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:43.425163    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:43.425172    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:43.425179    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:43.425196    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:43.425211    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:43.425227    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:43.425235    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:43.425243    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:43.425257    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:43.425270    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:43.425289    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:43.425298    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:43.425305    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:43.425314    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:43.425335    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:43.425348    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:43.425364    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:45.426099    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 2
	I0831 16:19:45.426114    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:45.426221    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:19:45.426991    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:19:45.427066    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:45.427079    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:45.427094    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:45.427108    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:45.427119    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:45.427127    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:45.427136    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:45.427142    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:45.427148    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:45.427156    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:45.427171    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:45.427179    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:45.427186    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:45.427194    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:45.427201    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:45.427209    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:45.427222    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:45.427231    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:47.316406    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:47 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 16:19:47.316599    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:47 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 16:19:47.316635    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:47 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 16:19:47.336669    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:19:47 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 16:19:47.429113    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 3
	I0831 16:19:47.429164    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:47.429326    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:19:47.430811    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:19:47.430899    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:47.430923    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:47.430956    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:47.430981    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:47.431000    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:47.431029    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:47.431046    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:47.431063    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:47.431075    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:47.431085    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:47.431095    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:47.431106    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:47.431122    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:47.431152    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:47.431162    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:47.431186    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:47.431199    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:47.431210    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:49.431429    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 4
	I0831 16:19:49.431451    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:49.431549    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:19:49.432325    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:19:49.432378    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:49.432393    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:49.432409    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:49.432431    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:49.432444    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:49.432451    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:49.432459    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:49.432469    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:49.432480    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:49.432508    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:49.432519    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:49.432530    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:49.432538    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:49.432544    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:49.432553    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:49.432559    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:49.432567    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:49.432583    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:51.434538    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 5
	I0831 16:19:51.434550    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:51.434659    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:19:51.435461    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:19:51.435499    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:51.435506    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:51.435518    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:51.435525    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:51.435532    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:51.435538    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:51.435545    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:51.435552    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:51.435559    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:51.435567    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:51.435586    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:51.435600    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:51.435615    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:51.435628    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:51.435641    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:51.435648    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:51.435655    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:51.435662    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:53.437167    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 6
	I0831 16:19:53.437181    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:53.437251    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:19:53.438031    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:19:53.438082    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:53.438096    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:53.438111    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:53.438120    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:53.438127    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:53.438137    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:53.438146    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:53.438157    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:53.438164    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:53.438174    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:53.438198    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:53.438209    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:53.438217    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:53.438226    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:53.438233    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:53.438240    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:53.438248    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:53.438253    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:55.440020    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 7
	I0831 16:19:55.440036    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:55.440046    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:19:55.441112    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:19:55.441152    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:55.441164    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:55.441174    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:55.441183    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:55.441198    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:55.441210    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:55.441222    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:55.441232    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:55.441240    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:55.441247    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:55.441259    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:55.441270    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:55.441279    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:55.441286    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:55.441305    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:55.441320    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:55.441327    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:55.441336    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:57.442015    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 8
	I0831 16:19:57.442026    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:57.442115    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:19:57.442863    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:19:57.442925    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:57.442935    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:57.442950    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:57.442961    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:57.442971    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:57.442980    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:57.442987    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:57.442993    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:57.443007    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:57.443022    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:57.443030    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:57.443037    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:57.443044    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:57.443052    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:57.443059    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:57.443064    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:57.443073    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:57.443083    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:59.444577    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 9
	I0831 16:19:59.444591    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:59.444660    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:19:59.445689    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:19:59.445736    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:59.445749    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:59.445765    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:59.445774    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:59.445781    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:59.445789    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:59.445798    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:59.445806    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:59.445818    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:59.445825    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:59.445833    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:59.445839    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:59.445847    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:59.445854    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:59.445862    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:59.445868    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:59.445876    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:59.445885    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:01.447889    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 10
	I0831 16:20:01.447922    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:01.447967    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:01.448761    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:01.448808    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:01.448819    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:01.448828    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:01.448835    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:01.448842    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:01.448848    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:01.448853    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:01.448859    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:01.448865    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:01.448871    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:01.448876    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:01.448883    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:01.448894    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:01.448909    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:01.448928    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:01.448936    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:01.448958    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:01.448973    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:03.450370    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 11
	I0831 16:20:03.450397    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:03.450445    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:03.451216    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:03.451274    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:03.451285    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:03.451296    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:03.451307    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:03.451315    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:03.451322    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:03.451328    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:03.451336    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:03.451349    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:03.451356    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:03.451362    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:03.451370    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:03.451379    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:03.451390    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:03.451397    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:03.451403    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:03.451411    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:03.451420    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:05.452606    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 12
	I0831 16:20:05.452619    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:05.452691    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:05.453518    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:05.453561    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:05.453583    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:05.453602    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:05.453615    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:05.453624    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:05.453631    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:05.453639    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:05.453646    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:05.453651    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:05.453669    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:05.453678    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:05.453691    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:05.453701    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:05.453709    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:05.453717    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:05.453723    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:05.453735    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:05.453748    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:07.455006    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 13
	I0831 16:20:07.455017    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:07.455137    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:07.456162    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:07.456212    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:07.456226    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:07.456241    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:07.456254    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:07.456263    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:07.456270    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:07.456277    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:07.456286    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:07.456303    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:07.456315    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:07.456324    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:07.456332    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:07.456341    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:07.456348    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:07.456355    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:07.456363    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:07.456369    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:07.456375    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:09.458385    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 14
	I0831 16:20:09.458397    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:09.458467    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:09.459522    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:09.459579    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:09.459590    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:09.459601    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:09.459612    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:09.459620    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:09.459626    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:09.459632    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:09.459640    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:09.459652    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:09.459666    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:09.459685    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:09.459694    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:09.459703    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:09.459712    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:09.459719    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:09.459725    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:09.459733    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:09.459741    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:11.461189    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 15
	I0831 16:20:11.461205    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:11.461266    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:11.462096    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:11.462140    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:11.462151    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:11.462161    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:11.462167    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:11.462174    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:11.462182    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:11.462189    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:11.462198    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:11.462210    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:11.462218    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:11.462225    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:11.462233    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:11.462251    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:11.462266    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:11.462274    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:11.462282    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:11.462299    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:11.462307    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:13.462914    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 16
	I0831 16:20:13.462929    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:13.462992    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:13.463806    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:13.463855    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:13.463865    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:13.463878    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:13.463887    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:13.463897    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:13.463904    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:13.463911    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:13.463920    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:13.463928    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:13.463933    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:13.463943    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:13.463952    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:13.463958    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:13.463965    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:13.463972    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:13.463978    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:13.463984    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:13.463990    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:15.466051    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 17
	I0831 16:20:15.466075    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:15.466123    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:15.466923    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:15.466958    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:15.466970    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:15.466991    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:15.467001    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:15.467019    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:15.467029    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:15.467036    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:15.467044    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:15.467051    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:15.467057    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:15.467078    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:15.467090    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:15.467098    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:15.467107    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:15.467117    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:15.467132    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:15.467140    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:15.467148    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:17.469175    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 18
	I0831 16:20:17.469187    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:17.469316    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:17.470164    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:17.470221    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:17.470237    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:17.470252    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:17.470262    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:17.470269    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:17.470276    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:17.470284    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:17.470291    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:17.470309    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:17.470315    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:17.470336    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:17.470348    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:17.470355    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:17.470364    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:17.470370    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:17.470382    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:17.470390    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:17.470396    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:19.471116    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 19
	I0831 16:20:19.471129    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:19.471200    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:19.472002    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:19.472048    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:19.472058    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:19.472068    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:19.472090    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:19.472099    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:19.472105    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:19.472113    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:19.472122    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:19.472136    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:19.472150    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:19.472158    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:19.472166    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:19.472180    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:19.472189    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:19.472200    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:19.472208    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:19.472215    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:19.472227    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:21.472260    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 20
	I0831 16:20:21.472274    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:21.472352    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:21.473126    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:21.473190    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:21.473208    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:21.473220    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:21.473226    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:21.473234    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:21.473242    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:21.473256    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:21.473264    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:21.473271    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:21.473279    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:21.473284    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:21.473292    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:21.473300    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:21.473308    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:21.473315    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:21.473322    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:21.473335    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:21.473347    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:23.473645    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 21
	I0831 16:20:23.473660    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:23.473699    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:23.474469    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:23.474527    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:23.474538    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:23.474546    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:23.474562    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:23.474575    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:23.474591    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:23.474609    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:23.474621    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:23.474629    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:23.474637    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:23.474647    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:23.474656    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:23.474664    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:23.474671    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:23.474683    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:23.474695    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:23.474750    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:23.474759    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:25.475516    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 22
	I0831 16:20:25.475528    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:25.475615    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:25.476474    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:25.476517    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:25.476529    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:25.476538    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:25.476548    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:25.476557    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:25.476569    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:25.476584    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:25.476597    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:25.476607    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:25.476614    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:25.476621    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:25.476629    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:25.476645    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:25.476662    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:25.476678    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:25.476687    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:25.476695    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:25.476703    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:27.478729    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 23
	I0831 16:20:27.478744    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:27.478818    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:27.479643    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:27.479688    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:27.479700    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:27.479712    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:27.479729    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:27.479750    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:27.479762    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:27.479769    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:27.479778    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:27.479785    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:27.479793    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:27.479801    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:27.479809    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:27.479816    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:27.479824    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:27.479833    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:27.479838    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:27.479849    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:27.479868    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:29.481885    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 24
	I0831 16:20:29.481896    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:29.481950    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:29.482717    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:29.482744    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:29.482754    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:29.482762    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:29.482772    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:29.482780    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:29.482789    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:29.482796    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:29.482802    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:29.482808    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:29.482815    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:29.482827    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:29.482844    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:29.482857    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:29.482867    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:29.482877    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:29.482885    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:29.482893    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:29.482902    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:31.484974    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 25
	I0831 16:20:31.484991    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:31.485071    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:31.486155    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:31.486221    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:31.486233    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:31.486243    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:31.486263    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:31.486278    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:31.486299    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:31.486310    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:31.486321    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:31.486328    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:31.486336    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:31.486349    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:31.486357    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:31.486364    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:31.486372    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:31.486382    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:31.486388    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:31.486395    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:31.486404    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:33.488393    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 26
	I0831 16:20:33.488407    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:33.488445    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:33.489430    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:33.489473    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:33.489482    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:33.489494    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:33.489499    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:33.489506    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:33.489512    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:33.489520    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:33.489525    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:33.489551    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:33.489564    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:33.489573    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:33.489590    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:33.489603    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:33.489617    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:33.489625    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:33.489638    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:33.489652    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:33.489672    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:35.491443    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 27
	I0831 16:20:35.491455    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:35.491518    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:35.492292    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:35.492329    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:35.492340    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:35.492354    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:35.492361    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:35.492373    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:35.492387    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:35.492396    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:35.492404    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:35.492411    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:35.492418    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:35.492482    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:35.492519    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:35.492549    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:35.492560    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:35.492573    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:35.492587    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:35.492602    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:35.492614    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:37.493167    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 28
	I0831 16:20:37.493190    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:37.493250    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:37.494068    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:37.494112    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:37.494123    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:37.494132    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:37.494138    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:37.494145    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:37.494153    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:37.494173    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:37.494187    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:37.494199    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:37.494207    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:37.494217    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:37.494227    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:37.494240    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:37.494254    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:37.494263    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:37.494272    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:37.494280    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:37.494288    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:39.494382    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 29
	I0831 16:20:39.494396    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:39.494471    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:39.495267    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 4e:6d:b9:70:e1:40 in /var/db/dhcpd_leases ...
	I0831 16:20:39.495335    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:39.495346    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:39.495360    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:39.495371    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:39.495377    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:39.495386    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:39.495398    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:39.495419    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:39.495431    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:39.495438    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:39.495444    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:39.495450    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:39.495470    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:39.495487    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:39.495499    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:39.495517    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:39.495529    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:39.495539    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:41.497521    6185 client.go:171] duration metric: took 1m0.825669286s to LocalClient.Create
	I0831 16:20:43.497744    6185 start.go:128] duration metric: took 1m2.858476849s to createHost
	I0831 16:20:43.497760    6185 start.go:83] releasing machines lock for "docker-flags-031000", held for 1m2.858598158s
	W0831 16:20:43.497775    6185 start.go:714] error starting host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 4e:6d:b9:70:e1:40
	I0831 16:20:43.498155    6185 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:20:43.498174    6185 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:20:43.507405    6185 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53720
	I0831 16:20:43.508000    6185 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:20:43.508461    6185 main.go:141] libmachine: Using API Version  1
	I0831 16:20:43.508474    6185 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:20:43.508832    6185 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:20:43.509249    6185 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:20:43.509268    6185 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:20:43.517931    6185 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53722
	I0831 16:20:43.518445    6185 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:20:43.518836    6185 main.go:141] libmachine: Using API Version  1
	I0831 16:20:43.518853    6185 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:20:43.519137    6185 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:20:43.519270    6185 main.go:141] libmachine: (docker-flags-031000) Calling .GetState
	I0831 16:20:43.519381    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:43.519446    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:43.520429    6185 main.go:141] libmachine: (docker-flags-031000) Calling .DriverName
	I0831 16:20:43.561191    6185 out.go:177] * Deleting "docker-flags-031000" in hyperkit ...
	I0831 16:20:43.603078    6185 main.go:141] libmachine: (docker-flags-031000) Calling .Remove
	I0831 16:20:43.603205    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:43.603223    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:43.603283    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:43.604228    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:43.604293    6185 main.go:141] libmachine: (docker-flags-031000) DBG | waiting for graceful shutdown
	I0831 16:20:44.606397    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:44.606511    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:44.607497    6185 main.go:141] libmachine: (docker-flags-031000) DBG | waiting for graceful shutdown
	I0831 16:20:45.607604    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:45.607713    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:45.609441    6185 main.go:141] libmachine: (docker-flags-031000) DBG | waiting for graceful shutdown
	I0831 16:20:46.611574    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:46.611620    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:46.612200    6185 main.go:141] libmachine: (docker-flags-031000) DBG | waiting for graceful shutdown
	I0831 16:20:47.612680    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:47.612756    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:47.613324    6185 main.go:141] libmachine: (docker-flags-031000) DBG | waiting for graceful shutdown
	I0831 16:20:48.613642    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:48.613770    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6211
	I0831 16:20:48.614936    6185 main.go:141] libmachine: (docker-flags-031000) DBG | sending sigkill
	I0831 16:20:48.614947    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:48.624862    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:20:48 WARN : hyperkit: failed to read stderr: EOF
	I0831 16:20:48.624882    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:20:48 WARN : hyperkit: failed to read stdout: EOF
	W0831 16:20:48.641551    6185 out.go:270] ! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 4e:6d:b9:70:e1:40
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 4e:6d:b9:70:e1:40
	I0831 16:20:48.641569    6185 start.go:729] Will try again in 5 seconds ...
	I0831 16:20:53.642114    6185 start.go:360] acquireMachinesLock for docker-flags-031000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:21:46.301572    6185 start.go:364] duration metric: took 52.65907285s to acquireMachinesLock for "docker-flags-031000"
	I0831 16:21:46.301595    6185 start.go:93] Provisioning new machine with config: &{Name:docker-flags-031000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[FOO=BAR BAZ=BAT] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[debug icc=true] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:docker-flags-031000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:false EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:false apps_running:false default_sa:false extra:false kubelet:false node_ready:false system_pods:false] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 16:21:46.301678    6185 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 16:21:46.343985    6185 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0831 16:21:46.344042    6185 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:21:46.344065    6185 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:21:46.353184    6185 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53727
	I0831 16:21:46.353704    6185 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:21:46.354209    6185 main.go:141] libmachine: Using API Version  1
	I0831 16:21:46.354224    6185 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:21:46.354469    6185 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:21:46.354595    6185 main.go:141] libmachine: (docker-flags-031000) Calling .GetMachineName
	I0831 16:21:46.354685    6185 main.go:141] libmachine: (docker-flags-031000) Calling .DriverName
	I0831 16:21:46.354821    6185 start.go:159] libmachine.API.Create for "docker-flags-031000" (driver="hyperkit")
	I0831 16:21:46.354862    6185 client.go:168] LocalClient.Create starting
	I0831 16:21:46.354889    6185 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 16:21:46.354942    6185 main.go:141] libmachine: Decoding PEM data...
	I0831 16:21:46.354956    6185 main.go:141] libmachine: Parsing certificate...
	I0831 16:21:46.354999    6185 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 16:21:46.355037    6185 main.go:141] libmachine: Decoding PEM data...
	I0831 16:21:46.355051    6185 main.go:141] libmachine: Parsing certificate...
	I0831 16:21:46.355063    6185 main.go:141] libmachine: Running pre-create checks...
	I0831 16:21:46.355068    6185 main.go:141] libmachine: (docker-flags-031000) Calling .PreCreateCheck
	I0831 16:21:46.355193    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:46.355222    6185 main.go:141] libmachine: (docker-flags-031000) Calling .GetConfigRaw
	I0831 16:21:46.365483    6185 main.go:141] libmachine: Creating machine...
	I0831 16:21:46.365493    6185 main.go:141] libmachine: (docker-flags-031000) Calling .Create
	I0831 16:21:46.365586    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:46.365786    6185 main.go:141] libmachine: (docker-flags-031000) DBG | I0831 16:21:46.365590    6237 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:21:46.365901    6185 main.go:141] libmachine: (docker-flags-031000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 16:21:46.793427    6185 main.go:141] libmachine: (docker-flags-031000) DBG | I0831 16:21:46.793349    6237 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/id_rsa...
	I0831 16:21:46.975522    6185 main.go:141] libmachine: (docker-flags-031000) DBG | I0831 16:21:46.975469    6237 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/docker-flags-031000.rawdisk...
	I0831 16:21:46.975533    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Writing magic tar header
	I0831 16:21:46.975550    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Writing SSH key tar header
	I0831 16:21:46.975870    6185 main.go:141] libmachine: (docker-flags-031000) DBG | I0831 16:21:46.975839    6237 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000 ...
	I0831 16:21:47.337495    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:47.337515    6185 main.go:141] libmachine: (docker-flags-031000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/hyperkit.pid
	I0831 16:21:47.337527    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Using UUID d7223e57-6657-4031-ba9d-32f55f0c4ccf
	I0831 16:21:47.363038    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Generated MAC 9e:ad:83:7e:88:14
	I0831 16:21:47.363059    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-031000
	I0831 16:21:47.363092    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"d7223e57-6657-4031-ba9d-32f55f0c4ccf", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000122330)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:21:47.363124    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"d7223e57-6657-4031-ba9d-32f55f0c4ccf", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000122330)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:21:47.363174    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "d7223e57-6657-4031-ba9d-32f55f0c4ccf", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/docker-flags-031000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-031000"}
	I0831 16:21:47.363222    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U d7223e57-6657-4031-ba9d-32f55f0c4ccf -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/docker-flags-031000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-031000"
	I0831 16:21:47.363245    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:21:47.366298    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 DEBUG: hyperkit: Pid is 6251
	I0831 16:21:47.366746    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 0
	I0831 16:21:47.366759    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:47.366848    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:21:47.367829    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:21:47.367912    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:47.367925    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:47.367955    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:47.367980    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:47.367993    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:47.368006    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:47.368021    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:47.368059    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:47.368076    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:47.368090    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:47.368108    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:47.368126    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:47.368142    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:47.368156    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:47.368172    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:47.368197    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:47.368213    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:47.368225    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:47.373676    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:21:47.381651    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/docker-flags-031000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:21:47.382550    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:21:47.382576    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:21:47.382590    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:21:47.382601    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:21:47.760108    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:21:47.760125    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:21:47.874680    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:21:47.874698    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:21:47.874711    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:21:47.874726    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:21:47.875588    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:21:47.875607    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:47 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:21:49.368651    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 1
	I0831 16:21:49.368670    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:49.368784    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:21:49.369577    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:21:49.369615    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:49.369622    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:49.369633    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:49.369640    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:49.369648    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:49.369656    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:49.369662    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:49.369667    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:49.369681    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:49.369694    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:49.369702    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:49.369710    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:49.369723    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:49.369743    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:49.369751    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:49.369759    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:49.369766    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:49.369773    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:51.371759    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 2
	I0831 16:21:51.371780    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:51.371875    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:21:51.372799    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:21:51.372853    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:51.372865    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:51.372889    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:51.372915    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:51.372932    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:51.372945    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:51.372960    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:51.372969    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:51.372984    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:51.372995    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:51.373003    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:51.373022    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:51.373037    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:51.373050    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:51.373058    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:51.373065    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:51.373085    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:51.373096    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:53.249232    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:53 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 16:21:53.249412    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:53 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 16:21:53.249423    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:53 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 16:21:53.269933    6185 main.go:141] libmachine: (docker-flags-031000) DBG | 2024/08/31 16:21:53 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 16:21:53.373426    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 3
	I0831 16:21:53.373451    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:53.373643    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:21:53.375088    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:21:53.375185    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:53.375207    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:53.375249    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:53.375264    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:53.375277    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:53.375292    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:53.375309    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:53.375323    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:53.375359    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:53.375396    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:53.375407    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:53.375419    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:53.375431    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:53.375442    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:53.375452    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:53.375461    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:53.375469    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:53.375481    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:55.376153    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 4
	I0831 16:21:55.376166    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:55.376281    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:21:55.377069    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:21:55.377128    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:55.377136    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:55.377148    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:55.377158    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:55.377166    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:55.377173    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:55.377188    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:55.377205    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:55.377218    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:55.377231    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:55.377242    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:55.377252    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:55.377260    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:55.377268    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:55.377277    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:55.377283    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:55.377289    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:55.377377    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:57.379284    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 5
	I0831 16:21:57.379304    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:57.379368    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:21:57.380303    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:21:57.380352    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:57.380365    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:57.380374    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:57.380400    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:57.380410    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:57.380423    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:57.380431    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:57.380440    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:57.380447    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:57.380453    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:57.380460    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:57.380469    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:57.380475    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:57.380483    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:57.380490    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:57.380497    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:57.380505    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:57.380512    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:59.382508    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 6
	I0831 16:21:59.382524    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:59.382614    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:21:59.383458    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:21:59.383511    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:59.383533    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:59.383544    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:59.383552    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:59.383559    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:59.383567    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:59.383575    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:59.383580    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:59.383591    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:59.383600    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:59.383607    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:59.383615    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:59.383623    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:59.383636    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:59.383644    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:59.383652    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:59.383669    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:59.383683    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:01.384568    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 7
	I0831 16:22:01.384603    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:01.384647    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:01.385411    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:01.385449    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:01.385458    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:01.385482    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:01.385492    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:01.385505    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:01.385513    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:01.385529    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:01.385544    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:01.385552    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:01.385566    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:01.385574    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:01.385588    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:01.385599    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:01.385614    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:01.385628    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:01.385636    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:01.385645    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:01.385654    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:03.386762    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 8
	I0831 16:22:03.386779    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:03.386837    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:03.387600    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:03.387648    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:03.387660    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:03.387675    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:03.387686    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:03.387693    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:03.387699    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:03.387715    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:03.387729    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:03.387737    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:03.387744    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:03.387760    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:03.387775    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:03.387784    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:03.387800    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:03.387818    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:03.387831    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:03.387847    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:03.387860    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:05.388966    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 9
	I0831 16:22:05.388989    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:05.389027    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:05.389793    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:05.389834    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:05.389851    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:05.389863    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:05.389877    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:05.389904    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:05.389918    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:05.389944    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:05.389972    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:05.389978    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:05.389987    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:05.389996    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:05.390004    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:05.390011    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:05.390017    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:05.390033    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:05.390045    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:05.390054    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:05.390060    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:07.390109    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 10
	I0831 16:22:07.390127    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:07.390190    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:07.390980    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:07.391031    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:07.391046    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:07.391057    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:07.391063    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:07.391070    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:07.391076    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:07.391082    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:07.391090    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:07.391103    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:07.391112    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:07.391119    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:07.391125    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:07.391131    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:07.391139    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:07.391146    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:07.391158    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:07.391174    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:07.391183    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:09.391235    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 11
	I0831 16:22:09.391247    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:09.391313    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:09.392079    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:09.392119    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:09.392130    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:09.392139    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:09.392145    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:09.392165    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:09.392178    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:09.392187    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:09.392195    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:09.392222    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:09.392234    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:09.392242    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:09.392250    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:09.392263    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:09.392271    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:09.392282    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:09.392290    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:09.392297    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:09.392304    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:11.393053    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 12
	I0831 16:22:11.393065    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:11.393134    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:11.393902    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:11.393956    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:11.393967    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:11.393987    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:11.393999    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:11.394006    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:11.394013    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:11.394022    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:11.394029    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:11.394038    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:11.394045    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:11.394053    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:11.394060    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:11.394067    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:11.394088    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:11.394101    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:11.394108    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:11.394116    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:11.394124    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:13.395634    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 13
	I0831 16:22:13.395644    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:13.395717    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:13.396496    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:13.396540    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:13.396557    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:13.396566    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:13.396586    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:13.396599    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:13.396607    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:13.396615    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:13.396632    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:13.396646    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:13.396654    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:13.396672    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:13.396688    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:13.396701    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:13.396711    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:13.396717    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:13.396732    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:13.396752    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:13.396763    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:15.397195    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 14
	I0831 16:22:15.397208    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:15.397330    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:15.398318    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:15.398371    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:15.398381    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:15.398390    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:15.398401    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:15.398417    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:15.398427    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:15.398435    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:15.398443    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:15.398452    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:15.398460    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:15.398475    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:15.398488    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:15.398496    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:15.398509    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:15.398525    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:15.398534    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:15.398543    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:15.398552    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:17.398854    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 15
	I0831 16:22:17.398866    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:17.398980    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:17.399924    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:17.399971    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:17.399986    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:17.399999    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:17.400007    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:17.400041    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:17.400058    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:17.400067    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:17.400075    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:17.400083    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:17.400089    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:17.400097    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:17.400104    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:17.400111    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:17.400121    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:17.400141    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:17.400163    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:17.400174    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:17.400183    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:19.402140    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 16
	I0831 16:22:19.402154    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:19.402217    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:19.402992    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:19.403033    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:19.403041    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:19.403049    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:19.403056    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:19.403063    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:19.403071    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:19.403078    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:19.403087    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:19.403092    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:19.403099    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:19.403105    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:19.403110    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:19.403118    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:19.403124    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:19.403129    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:19.403136    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:19.403143    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:19.403151    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:21.405208    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 17
	I0831 16:22:21.405221    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:21.405290    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:21.406073    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:21.406124    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:21.406136    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:21.406149    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:21.406158    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:21.406166    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:21.406176    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:21.406183    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:21.406191    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:21.406198    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:21.406206    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:21.406215    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:21.406223    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:21.406239    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:21.406253    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:21.406261    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:21.406269    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:21.406276    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:21.406284    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:23.407388    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 18
	I0831 16:22:23.407399    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:23.407460    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:23.408280    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:23.408312    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:23.408319    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:23.408341    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:23.408349    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:23.408360    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:23.408367    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:23.408374    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:23.408383    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:23.408392    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:23.408397    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:23.408406    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:23.408413    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:23.408421    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:23.408442    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:23.408456    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:23.408463    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:23.408472    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:23.408481    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:25.409913    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 19
	I0831 16:22:25.409925    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:25.409948    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:25.410877    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:25.410934    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:25.410945    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:25.410952    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:25.410959    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:25.410972    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:25.410980    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:25.410988    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:25.410998    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:25.411010    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:25.411021    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:25.411029    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:25.411048    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:25.411061    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:25.411072    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:25.411081    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:25.411090    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:25.411108    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:25.411124    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
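The retry loop above repeats one simple check: scan the host's `/var/db/dhcpd_leases` entries for the VM's MAC address (`9e:ad:83:7e:88:14`), and if it is absent from all 17 entries, sleep and try again. The following is a minimal illustrative sketch of that lookup, not the driver's actual implementation; the `lease` struct and `findIPByMAC` helper are hypothetical names mirroring the fields the log prints (`Name`, `IPAddress`, `HWAddress`), and the real driver parses brace-delimited blocks from the lease file rather than taking a pre-built slice.

```go
package main

import "strings"

// lease mirrors the fields the hyperkit driver logs for each
// /var/db/dhcpd_leases entry (hypothetical struct for illustration).
type lease struct {
	Name      string
	IPAddress string
	HWAddress string
}

// findIPByMAC returns the IP leased to the given MAC address, if any.
// A miss — as in the log above, where 9e:ad:83:7e:88:14 never appears
// among the 17 entries — returns ok=false, and the caller retries
// after a delay (the numbered "Attempt N" lines).
func findIPByMAC(leases []lease, mac string) (ip string, ok bool) {
	for _, l := range leases {
		// MAC addresses are compared case-insensitively.
		if strings.EqualFold(l.HWAddress, mac) {
			return l.IPAddress, true
		}
	}
	return "", false
}
```

The failure mode visible in this log is exactly the miss branch: the VM never obtains a DHCP lease, so every scan returns `ok=false` until the driver's attempt budget is exhausted.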
	I0831 16:22:27.411529    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 20
	I0831 16:22:27.411541    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:27.411579    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:27.412383    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:27.412419    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:27.412430    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:27.412452    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:27.412459    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:27.412466    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:27.412474    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:27.412481    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:27.412487    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:27.412494    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:27.412502    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:27.412515    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:27.412527    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:27.412537    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:27.412545    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:27.412552    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:27.412560    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:27.412567    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:27.412573    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:29.412664    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 21
	I0831 16:22:29.412678    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:29.412725    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:29.413560    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:29.413568    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:29.413577    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:29.413583    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:29.413598    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:29.413605    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:29.413611    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:29.413618    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:29.413635    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:29.413646    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:29.413655    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:29.413664    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:29.413684    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:29.413697    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:29.413709    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:29.413716    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:29.413724    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:29.413731    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:29.413738    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:31.414177    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 22
	I0831 16:22:31.414192    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:31.414289    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:31.415115    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:31.415130    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:31.415139    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:31.415148    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:31.415157    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:31.415163    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:31.415180    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:31.415192    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:31.415207    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:31.415220    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:31.415233    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:31.415241    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:31.415254    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:31.415262    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:31.415270    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:31.415278    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:31.415287    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:31.415297    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:31.415310    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:33.417312    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 23
	I0831 16:22:33.417325    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:33.417378    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:33.418172    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:33.418219    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:33.418232    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:33.418245    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:33.418254    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:33.418268    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:33.418278    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:33.418291    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:33.418298    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:33.418306    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:33.418313    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:33.418320    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:33.418326    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:33.418332    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:33.418337    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:33.418343    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:33.418361    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:33.418372    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:33.418381    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:35.418465    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 24
	I0831 16:22:35.418477    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:35.418555    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:35.419352    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:35.419407    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:35.419425    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:35.419438    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:35.419449    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:35.419464    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:35.419476    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:35.419489    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:35.419499    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:35.419507    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:35.419514    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:35.419527    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:35.419540    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:35.419555    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:35.419567    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:35.419576    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:35.419584    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:35.419591    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:35.419599    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:37.420281    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 25
	I0831 16:22:37.420300    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:37.420367    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:37.421263    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:37.421306    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:37.421317    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:37.421330    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:37.421349    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:37.421358    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:37.421367    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:37.421375    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:37.421384    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:37.421390    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:37.421398    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:37.421411    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:37.421420    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:37.421429    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:37.421438    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:37.421448    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:37.421456    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:37.421463    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:37.421471    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:39.421726    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 26
	I0831 16:22:39.421738    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:39.421826    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:39.422611    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:39.422666    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:39.422682    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:39.422693    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:39.422701    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:39.422723    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:39.422737    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:39.422747    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:39.422758    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:39.422772    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:39.422787    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:39.422801    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:39.422811    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:39.422821    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:39.422829    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:39.422844    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:39.422859    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:39.422914    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:39.422921    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:41.423228    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 27
	I0831 16:22:41.423240    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:41.423294    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:41.424048    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:41.424093    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:41.424104    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:41.424112    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:41.424119    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:41.424135    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:41.424148    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:41.424158    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:41.424165    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:41.424174    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:41.424181    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:41.424192    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:41.424204    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:41.424211    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:41.424219    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:41.424226    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:41.424236    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:41.424244    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:41.424253    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:43.425165    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 28
	I0831 16:22:43.425583    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:43.425608    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:43.426066    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:43.426086    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:43.426107    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:43.426121    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:43.426131    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:43.426140    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:43.426158    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:43.426174    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:43.426183    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:43.426189    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:43.426237    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:43.426263    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:43.426285    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:43.426296    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:43.426307    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:43.426314    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:43.426334    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:43.426341    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:43.426365    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:45.426336    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Attempt 29
	I0831 16:22:45.426358    6185 main.go:141] libmachine: (docker-flags-031000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:22:45.426404    6185 main.go:141] libmachine: (docker-flags-031000) DBG | hyperkit pid from json: 6251
	I0831 16:22:45.427185    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Searching for 9e:ad:83:7e:88:14 in /var/db/dhcpd_leases ...
	I0831 16:22:45.427242    6185 main.go:141] libmachine: (docker-flags-031000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:22:45.427253    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:22:45.427265    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:22:45.427272    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:22:45.427289    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:22:45.427298    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:22:45.427307    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:22:45.427313    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:22:45.427320    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:22:45.427326    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:22:45.427333    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:22:45.427370    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:22:45.427387    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:22:45.427395    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:22:45.427401    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:22:45.427407    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:22:45.427413    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:22:45.427421    6185 main.go:141] libmachine: (docker-flags-031000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:22:47.428191    6185 client.go:171] duration metric: took 1m1.072919787s to LocalClient.Create
	I0831 16:22:49.429689    6185 start.go:128] duration metric: took 1m3.127588238s to createHost
	I0831 16:22:49.429702    6185 start.go:83] releasing machines lock for "docker-flags-031000", held for 1m3.127704178s
	W0831 16:22:49.429788    6185 out.go:270] * Failed to start hyperkit VM. Running "minikube delete -p docker-flags-031000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 9e:ad:83:7e:88:14
	* Failed to start hyperkit VM. Running "minikube delete -p docker-flags-031000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 9e:ad:83:7e:88:14
	I0831 16:22:49.493253    6185 out.go:201] 
	W0831 16:22:49.514328    6185 out.go:270] X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 9e:ad:83:7e:88:14
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 9e:ad:83:7e:88:14
	W0831 16:22:49.514341    6185 out.go:270] * 
	* 
	W0831 16:22:49.514926    6185 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 16:22:49.577447    6185 out.go:201] 

                                                
                                                
** /stderr **
docker_test.go:53: failed to start minikube with args: "out/minikube-darwin-amd64 start -p docker-flags-031000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit " : exit status 80
docker_test.go:56: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-031000 ssh "sudo systemctl show docker --property=Environment --no-pager"
docker_test.go:56: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p docker-flags-031000 ssh "sudo systemctl show docker --property=Environment --no-pager": exit status 50 (183.028397ms)

                                                
                                                
-- stdout --
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node docker-flags-031000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

                                                
                                                
** /stderr **
docker_test.go:58: failed to 'systemctl show docker' inside minikube. args "out/minikube-darwin-amd64 -p docker-flags-031000 ssh \"sudo systemctl show docker --property=Environment --no-pager\"": exit status 50
docker_test.go:63: expected env key/value "FOO=BAR" to be passed to minikube's docker and be included in: *"\n\n"*.
docker_test.go:63: expected env key/value "BAZ=BAT" to be passed to minikube's docker and be included in: *"\n\n"*.
docker_test.go:67: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-031000 ssh "sudo systemctl show docker --property=ExecStart --no-pager"
docker_test.go:67: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p docker-flags-031000 ssh "sudo systemctl show docker --property=ExecStart --no-pager": exit status 50 (171.058174ms)

                                                
                                                
-- stdout --
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node docker-flags-031000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

                                                
                                                
** /stderr **
docker_test.go:69: failed on the second 'systemctl show docker' inside minikube. args "out/minikube-darwin-amd64 -p docker-flags-031000 ssh \"sudo systemctl show docker --property=ExecStart --no-pager\"": exit status 50
docker_test.go:73: expected "out/minikube-darwin-amd64 -p docker-flags-031000 ssh \"sudo systemctl show docker --property=ExecStart --no-pager\"" output to have include *--debug* . output: "\n\n"
panic.go:626: *** TestDockerFlags FAILED at 2024-08-31 16:22:50.045115 -0700 PDT m=+4667.240321439
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p docker-flags-031000 -n docker-flags-031000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p docker-flags-031000 -n docker-flags-031000: exit status 7 (89.19091ms)

                                                
                                                
-- stdout --
	Error

                                                
                                                
-- /stdout --
** stderr ** 
	E0831 16:22:50.132137    6278 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0831 16:22:50.132157    6278 status.go:249] status error: getting IP: IP address is not set

                                                
                                                
** /stderr **
helpers_test.go:240: status error: exit status 7 (may be ok)
helpers_test.go:242: "docker-flags-031000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:176: Cleaning up "docker-flags-031000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p docker-flags-031000
E0831 16:22:52.740140    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p docker-flags-031000: (5.256168143s)
--- FAIL: TestDockerFlags (252.12s)

                                                
                                    
TestForceSystemdFlag (252.02s)

                                                
                                                
=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

                                                
                                                

                                                
                                                
=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-flag-286000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit 
E0831 16:17:52.738810    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
docker_test.go:91: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p force-systemd-flag-286000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit : exit status 80 (4m6.436596765s)

                                                
                                                
-- stdout --
	* [force-systemd-flag-286000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "force-systemd-flag-286000" primary control-plane node in "force-systemd-flag-286000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "force-systemd-flag-286000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 16:17:40.108670    6150 out.go:345] Setting OutFile to fd 1 ...
	I0831 16:17:40.108859    6150 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:17:40.108864    6150 out.go:358] Setting ErrFile to fd 2...
	I0831 16:17:40.108868    6150 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:17:40.109050    6150 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 16:17:40.110479    6150 out.go:352] Setting JSON to false
	I0831 16:17:40.132984    6150 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":4631,"bootTime":1725141629,"procs":442,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 16:17:40.133078    6150 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 16:17:40.154802    6150 out.go:177] * [force-systemd-flag-286000] minikube v1.33.1 on Darwin 14.6.1
	I0831 16:17:40.197649    6150 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 16:17:40.197680    6150 notify.go:220] Checking for updates...
	I0831 16:17:40.239534    6150 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:17:40.260516    6150 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 16:17:40.281352    6150 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 16:17:40.302485    6150 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:17:40.325516    6150 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 16:17:40.346860    6150 config.go:182] Loaded profile config "force-systemd-env-257000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:17:40.346959    6150 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 16:17:40.375547    6150 out.go:177] * Using the hyperkit driver based on user configuration
	I0831 16:17:40.417358    6150 start.go:297] selected driver: hyperkit
	I0831 16:17:40.417373    6150 start.go:901] validating driver "hyperkit" against <nil>
	I0831 16:17:40.417386    6150 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 16:17:40.420430    6150 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:17:40.420545    6150 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 16:17:40.428989    6150 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 16:17:40.432904    6150 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:17:40.432926    6150 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 16:17:40.432980    6150 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 16:17:40.433169    6150 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0831 16:17:40.433230    6150 cni.go:84] Creating CNI manager for ""
	I0831 16:17:40.433255    6150 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0831 16:17:40.433263    6150 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0831 16:17:40.433329    6150 start.go:340] cluster config:
	{Name:force-systemd-flag-286000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:force-systemd-flag-286000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:clus
ter.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:17:40.433411    6150 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:17:40.475503    6150 out.go:177] * Starting "force-systemd-flag-286000" primary control-plane node in "force-systemd-flag-286000" cluster
	I0831 16:17:40.496369    6150 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 16:17:40.496410    6150 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 16:17:40.496427    6150 cache.go:56] Caching tarball of preloaded images
	I0831 16:17:40.496544    6150 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 16:17:40.496554    6150 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 16:17:40.496636    6150 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/force-systemd-flag-286000/config.json ...
	I0831 16:17:40.496657    6150 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/force-systemd-flag-286000/config.json: {Name:mk33b862058ee7fc5dce75b266dcd7d82b9c82e6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 16:17:40.497028    6150 start.go:360] acquireMachinesLock for force-systemd-flag-286000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:18:37.476606    6150 start.go:364] duration metric: took 56.979187081s to acquireMachinesLock for "force-systemd-flag-286000"
	I0831 16:18:37.476646    6150 start.go:93] Provisioning new machine with config: &{Name:force-systemd-flag-286000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kuberne
tesConfig:{KubernetesVersion:v1.31.0 ClusterName:force-systemd-flag-286000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: Disable
Optimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 16:18:37.476693    6150 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 16:18:37.498124    6150 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0831 16:18:37.498335    6150 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:18:37.498375    6150 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:18:37.507567    6150 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53698
	I0831 16:18:37.507908    6150 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:18:37.508311    6150 main.go:141] libmachine: Using API Version  1
	I0831 16:18:37.508319    6150 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:18:37.508577    6150 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:18:37.508685    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .GetMachineName
	I0831 16:18:37.508774    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .DriverName
	I0831 16:18:37.508878    6150 start.go:159] libmachine.API.Create for "force-systemd-flag-286000" (driver="hyperkit")
	I0831 16:18:37.508900    6150 client.go:168] LocalClient.Create starting
	I0831 16:18:37.508933    6150 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 16:18:37.508988    6150 main.go:141] libmachine: Decoding PEM data...
	I0831 16:18:37.509007    6150 main.go:141] libmachine: Parsing certificate...
	I0831 16:18:37.509066    6150 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 16:18:37.509104    6150 main.go:141] libmachine: Decoding PEM data...
	I0831 16:18:37.509116    6150 main.go:141] libmachine: Parsing certificate...
	I0831 16:18:37.509134    6150 main.go:141] libmachine: Running pre-create checks...
	I0831 16:18:37.509143    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .PreCreateCheck
	I0831 16:18:37.509211    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:37.509399    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .GetConfigRaw
	I0831 16:18:37.539991    6150 main.go:141] libmachine: Creating machine...
	I0831 16:18:37.540001    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .Create
	I0831 16:18:37.540101    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:37.540236    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | I0831 16:18:37.540093    6169 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:18:37.540281    6150 main.go:141] libmachine: (force-systemd-flag-286000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 16:18:37.975883    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | I0831 16:18:37.975763    6169 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/id_rsa...
	I0831 16:18:38.165721    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | I0831 16:18:38.165629    6169 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/force-systemd-flag-286000.rawdisk...
	I0831 16:18:38.165733    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Writing magic tar header
	I0831 16:18:38.165746    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Writing SSH key tar header
	I0831 16:18:38.166083    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | I0831 16:18:38.166045    6169 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000 ...
	I0831 16:18:38.528419    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:38.528439    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/hyperkit.pid
	I0831 16:18:38.528450    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Using UUID 8f9d4e29-24d6-4c83-b7b0-90eda4fec7d1
	I0831 16:18:38.554020    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Generated MAC 4e:e1:7b:7c:72:41
	I0831 16:18:38.554045    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-286000
	I0831 16:18:38.554083    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8f9d4e29-24d6-4c83-b7b0-90eda4fec7d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:18:38.554132    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8f9d4e29-24d6-4c83-b7b0-90eda4fec7d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:18:38.554174    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "8f9d4e29-24d6-4c83-b7b0-90eda4fec7d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/force-systemd-flag-286000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-286000"}
	I0831 16:18:38.554215    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 8f9d4e29-24d6-4c83-b7b0-90eda4fec7d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/force-systemd-flag-286000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-286000"
	I0831 16:18:38.554231    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:18:38.557138    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 DEBUG: hyperkit: Pid is 6184
	I0831 16:18:38.557569    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 0
	I0831 16:18:38.557585    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:38.557724    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:38.558650    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:38.558691    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:38.558719    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:38.558736    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:38.558759    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:38.558770    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:38.558783    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:38.558817    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:38.558844    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:38.558866    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:38.558891    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:38.558912    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:38.558936    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:38.558954    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:38.558972    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:38.558986    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:38.559001    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:38.559013    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:38.559029    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:38.564792    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:18:38.572639    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:18:38.573563    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:18:38.573592    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:18:38.573604    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:18:38.573616    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:18:38.948747    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:18:38.948767    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:38 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:18:39.063832    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:39 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:18:39.063854    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:39 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:18:39.063879    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:39 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:18:39.063888    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:39 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:18:39.064725    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:39 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:18:39.064736    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:39 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:18:40.559575    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 1
	I0831 16:18:40.559593    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:40.559652    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:40.560473    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:40.560535    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:40.560545    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:40.560556    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:40.560565    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:40.560573    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:40.560578    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:40.560597    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:40.560611    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:40.560617    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:40.560624    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:40.560634    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:40.560641    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:40.560653    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:40.560660    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:40.560669    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:40.560675    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:40.560682    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:40.560689    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:42.561240    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 2
	I0831 16:18:42.561258    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:42.561338    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:42.562186    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:42.562247    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:42.562255    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:42.562264    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:42.562269    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:42.562294    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:42.562303    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:42.562312    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:42.562319    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:42.562327    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:42.562334    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:42.562353    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:42.562365    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:42.562383    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:42.562397    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:42.562420    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:42.562428    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:42.562444    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:42.562457    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:44.449568    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:44 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 16:18:44.449702    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:44 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 16:18:44.449711    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:44 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 16:18:44.469895    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:18:44 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 16:18:44.563289    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 3
	I0831 16:18:44.563317    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:44.563537    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:44.565037    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:44.565148    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:44.565168    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:44.565188    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:44.565201    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:44.565253    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:44.565269    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:44.565280    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:44.565293    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:44.565306    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:44.565315    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:44.565335    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:44.565350    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:44.565370    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:44.565402    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:44.565412    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:44.565421    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:44.565433    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:44.565444    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:46.566508    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 4
	I0831 16:18:46.566537    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:46.566639    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:46.567431    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:46.567489    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:46.567498    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:46.567506    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:46.567515    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:46.567544    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:46.567552    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:46.567559    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:46.567568    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:46.567575    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:46.567584    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:46.567590    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:46.567599    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:46.567607    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:46.567620    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:46.567635    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:46.567649    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:46.567661    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:46.567669    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:48.568181    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 5
	I0831 16:18:48.568197    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:48.568251    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:48.569035    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:48.569093    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:48.569105    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:48.569113    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:48.569122    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:48.569131    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:48.569136    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:48.569152    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:48.569166    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:48.569179    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:48.569186    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:48.569193    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:48.569201    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:48.569211    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:48.569225    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:48.569232    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:48.569239    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:48.569254    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:48.569267    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:50.571282    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 6
	I0831 16:18:50.571298    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:50.571355    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:50.572175    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:50.572228    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:50.572238    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:50.572245    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:50.572251    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:50.572269    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:50.572281    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:50.572290    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:50.572300    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:50.572318    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:50.572332    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:50.572342    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:50.572350    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:50.572367    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:50.572378    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:50.572386    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:50.572394    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:50.572401    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:50.572417    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:52.572541    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 7
	I0831 16:18:52.572563    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:52.572674    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:52.573488    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:52.573543    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:52.573556    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:52.573571    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:52.573588    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:52.573597    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:52.573609    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:52.573620    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:52.573634    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:52.573646    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:52.573661    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:52.573674    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:52.573682    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:52.573690    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:52.573703    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:52.573712    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:52.573719    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:52.573727    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:52.573735    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:54.574649    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 8
	I0831 16:18:54.574662    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:54.574717    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:54.575598    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:54.575639    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:54.575649    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:54.575667    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:54.575674    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:54.575681    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:54.575687    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:54.575701    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:54.575712    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:54.575726    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:54.575735    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:54.575742    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:54.575752    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:54.575759    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:54.575770    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:54.575779    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:54.575788    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:54.575794    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:54.575817    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:56.577812    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 9
	I0831 16:18:56.577826    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:56.577886    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:56.578673    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:56.578718    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:56.578728    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:56.578747    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:56.578757    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:56.578777    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:56.578785    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:56.578792    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:56.578801    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:56.578809    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:56.578817    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:56.578823    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:56.578831    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:56.578839    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:56.578847    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:56.578854    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:56.578866    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:56.578874    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:56.578883    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:58.580886    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 10
	I0831 16:18:58.580898    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:58.580976    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:18:58.581796    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:18:58.581842    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:58.581854    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:58.581884    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:58.581902    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:58.581909    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:58.581916    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:58.581922    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:58.581930    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:58.581940    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:58.581947    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:58.581955    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:58.581962    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:58.581971    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:58.581978    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:58.581984    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:58.581991    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:58.581999    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:58.582007    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:00.583016    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 11
	I0831 16:19:00.583056    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:00.583088    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:00.583921    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:00.583959    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:00.583970    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:00.583980    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:00.583994    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:00.584001    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:00.584008    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:00.584017    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:00.584028    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:00.584035    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:00.584043    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:00.584051    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:00.584058    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:00.584066    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:00.584085    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:00.584097    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:00.584109    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:00.584118    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:00.584126    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:02.586161    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 12
	I0831 16:19:02.586175    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:02.586226    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:02.587210    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:02.587282    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:02.587326    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:02.587336    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:02.587347    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:02.587360    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:02.587369    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:02.587376    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:02.587385    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:02.587410    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:02.587445    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:02.587455    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:02.587463    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:02.587470    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:02.587487    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:02.587500    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:02.587516    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:02.587527    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:02.587538    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:04.588990    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 13
	I0831 16:19:04.589012    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:04.589093    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:04.589845    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:04.589892    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:04.589915    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:04.589931    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:04.589943    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:04.589949    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:04.589958    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:04.589966    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:04.589974    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:04.589981    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:04.589990    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:04.590008    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:04.590019    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:04.590028    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:04.590037    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:04.590044    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:04.590052    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:04.590068    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:04.590077    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:06.592098    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 14
	I0831 16:19:06.592113    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:06.592191    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:06.593052    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:06.593089    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:06.593110    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:06.593135    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:06.593149    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:06.593178    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:06.593192    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:06.593206    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:06.593215    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:06.593222    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:06.593229    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:06.593236    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:06.593243    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:06.593249    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:06.593272    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:06.593284    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:06.593301    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:06.593312    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:06.593321    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:08.594250    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 15
	I0831 16:19:08.594263    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:08.594320    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:08.595114    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:08.595152    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:08.595169    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:08.595190    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:08.595202    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:08.595209    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:08.595218    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:08.595224    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:08.595232    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:08.595248    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:08.595260    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:08.595272    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:08.595282    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:08.595290    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:08.595305    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:08.595316    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:08.595323    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:08.595330    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:08.595338    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:10.597339    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 16
	I0831 16:19:10.597353    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:10.597426    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:10.598241    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:10.598289    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:10.598298    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:10.598317    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:10.598326    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:10.598334    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:10.598341    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:10.598347    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:10.598354    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:10.598360    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:10.598366    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:10.598383    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:10.598399    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:10.598409    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:10.598418    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:10.598425    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:10.598431    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:10.598439    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:10.598445    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:12.599925    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 17
	I0831 16:19:12.599940    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:12.599995    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:12.600825    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:12.600876    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:12.600889    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:12.600898    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:12.600907    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:12.600916    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:12.600923    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:12.600931    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:12.600938    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:12.600946    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:12.600961    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:12.600972    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:12.600982    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:12.600991    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:12.600998    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:12.601006    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:12.601013    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:12.601019    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:12.601028    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:14.602393    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 18
	I0831 16:19:14.602408    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:14.602468    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:14.603281    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:14.603314    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:14.603330    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:14.603354    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:14.603381    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:14.603394    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:14.603407    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:14.603420    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:14.603437    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:14.603449    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:14.603457    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:14.603467    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:14.603474    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:14.603482    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:14.603490    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:14.603498    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:14.603507    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:14.603515    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:14.603523    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:16.605469    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 19
	I0831 16:19:16.605481    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:16.605561    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:16.606360    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:16.606442    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:16.606477    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:16.606493    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:16.606502    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:16.606510    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:16.606516    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:16.606529    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:16.606542    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:16.606550    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:16.606558    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:16.606565    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:16.606571    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:16.606578    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:16.606584    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:16.606605    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:16.606616    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:16.606624    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:16.606639    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:18.608601    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 20
	I0831 16:19:18.608613    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:18.608735    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:18.609506    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:18.609554    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:18.609562    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:18.609571    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:18.609580    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:18.609586    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:18.609592    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:18.609610    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:18.609625    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:18.609637    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:18.609646    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:18.609653    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:18.609661    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:18.609669    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:18.609675    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:18.609694    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:18.609707    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:18.609715    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:18.609724    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:20.611733    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 21
	I0831 16:19:20.611748    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:20.611823    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:20.612610    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:20.612660    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:20.612670    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:20.612679    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:20.612685    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:20.612691    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:20.612700    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:20.612708    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:20.612726    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:20.612734    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:20.612741    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:20.612749    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:20.612755    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:20.612764    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:20.612775    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:20.612784    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:20.612791    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:20.612799    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:20.612810    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:22.614808    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 22
	I0831 16:19:22.614823    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:22.614898    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:22.615749    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:22.615785    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:22.615793    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:22.615802    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:22.615811    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:22.615819    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:22.615829    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:22.615837    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:22.615843    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:22.615860    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:22.615869    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:22.615886    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:22.615898    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:22.615907    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:22.615913    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:22.615919    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:22.615937    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:22.615952    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:22.615965    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:24.616257    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 23
	I0831 16:19:24.616271    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:24.616335    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:24.617117    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:24.617180    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:24.617190    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:24.617197    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:24.617206    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:24.617215    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:24.617221    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:24.617228    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:24.617234    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:24.617241    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:24.617249    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:24.617256    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:24.617274    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:24.617289    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:24.617303    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:24.617311    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:24.617320    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:24.617328    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:24.617336    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:26.619144    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 24
	I0831 16:19:26.619159    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:26.619223    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:26.620034    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:26.620055    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:26.620063    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:26.620072    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:26.620078    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:26.620085    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:26.620094    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:26.620101    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:26.620107    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:26.620114    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:26.620120    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:26.620127    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:26.620134    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:26.620147    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:26.620154    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:26.620161    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:26.620169    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:26.620175    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:26.620181    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:28.622262    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 25
	I0831 16:19:28.622275    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:28.622336    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:28.623115    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:28.623165    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:28.623176    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:28.623186    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:28.623195    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:28.623203    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:28.623225    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:28.623232    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:28.623238    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:28.623247    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:28.623254    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:28.623262    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:28.623269    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:28.623277    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:28.623283    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:28.623288    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:28.623309    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:28.623322    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:28.623334    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:30.625335    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 26
	I0831 16:19:30.625346    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:30.625445    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:30.626227    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:30.626262    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:30.626272    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:30.626285    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:30.626294    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:30.626328    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:30.626340    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:30.626347    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:30.626356    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:30.626371    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:30.626378    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:30.626384    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:30.626393    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:30.626408    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:30.626415    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:30.626424    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:30.626442    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:30.626454    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:30.626468    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:32.628453    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 27
	I0831 16:19:32.628465    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:32.628534    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:32.629346    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:32.629383    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:32.629392    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:32.629402    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:32.629410    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:32.629416    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:32.629422    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:32.629429    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:32.629436    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:32.629442    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:32.629458    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:32.629467    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:32.629488    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:32.629496    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:32.629504    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:32.629516    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:32.629524    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:32.629531    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:32.629540    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:34.629889    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 28
	I0831 16:19:34.629905    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:34.629976    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:34.630827    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:34.630876    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:34.630897    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:34.630909    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:34.630920    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:34.630928    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:34.630935    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:34.630949    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:34.630956    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:34.630964    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:34.630978    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:34.630988    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:34.630996    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:34.631005    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:34.631012    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:34.631020    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:34.631027    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:34.631043    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:34.631051    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:36.633084    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 29
	I0831 16:19:36.633097    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:36.633158    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:36.634248    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 4e:e1:7b:7c:72:41 in /var/db/dhcpd_leases ...
	I0831 16:19:36.634303    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:19:36.634314    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:19:36.634321    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:19:36.634327    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:19:36.634335    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:19:36.634343    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:19:36.634351    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:19:36.634358    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:19:36.634368    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:19:36.634374    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:19:36.634381    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:19:36.634389    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:19:36.634395    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:19:36.634402    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:19:36.634409    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:19:36.634416    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:19:36.634431    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:19:36.634445    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:19:38.636470    6150 client.go:171] duration metric: took 1m1.12715749s to LocalClient.Create
	I0831 16:19:40.638630    6150 start.go:128] duration metric: took 1m3.161495853s to createHost
	I0831 16:19:40.638665    6150 start.go:83] releasing machines lock for "force-systemd-flag-286000", held for 1m3.161613957s
	W0831 16:19:40.638702    6150 start.go:714] error starting host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 4e:e1:7b:7c:72:41
	I0831 16:19:40.639027    6150 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:19:40.639058    6150 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:19:40.648216    6150 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53714
	I0831 16:19:40.648615    6150 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:19:40.648981    6150 main.go:141] libmachine: Using API Version  1
	I0831 16:19:40.649002    6150 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:19:40.649277    6150 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:19:40.649622    6150 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:19:40.649644    6150 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:19:40.658170    6150 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53716
	I0831 16:19:40.658692    6150 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:19:40.659048    6150 main.go:141] libmachine: Using API Version  1
	I0831 16:19:40.659058    6150 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:19:40.659309    6150 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:19:40.659476    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .GetState
	I0831 16:19:40.659624    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:40.659698    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:40.660720    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .DriverName
	I0831 16:19:40.702053    6150 out.go:177] * Deleting "force-systemd-flag-286000" in hyperkit ...
	I0831 16:19:40.723063    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .Remove
	I0831 16:19:40.723193    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:40.723207    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:40.723267    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:40.724199    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:40.724284    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | waiting for graceful shutdown
	I0831 16:19:41.725417    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:41.725529    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:41.726428    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | waiting for graceful shutdown
	I0831 16:19:42.728588    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:42.728673    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:42.730272    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | waiting for graceful shutdown
	I0831 16:19:43.730910    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:43.731002    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:43.731677    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | waiting for graceful shutdown
	I0831 16:19:44.732401    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:44.732456    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:44.732997    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | waiting for graceful shutdown
	I0831 16:19:45.735076    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:19:45.735153    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6184
	I0831 16:19:45.736172    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | sending sigkill
	I0831 16:19:45.736181    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	W0831 16:19:45.748606    6150 out.go:270] ! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 4e:e1:7b:7c:72:41
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 4e:e1:7b:7c:72:41
	I0831 16:19:45.748629    6150 start.go:729] Will try again in 5 seconds ...
	I0831 16:19:45.757840    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:19:45 WARN : hyperkit: failed to read stderr: EOF
	I0831 16:19:45.757854    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:19:45 WARN : hyperkit: failed to read stdout: EOF
	I0831 16:19:50.749255    6150 start.go:360] acquireMachinesLock for force-systemd-flag-286000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:20:43.497791    6150 start.go:364] duration metric: took 52.748169747s to acquireMachinesLock for "force-systemd-flag-286000"
	I0831 16:20:43.497813    6150 start.go:93] Provisioning new machine with config: &{Name:force-systemd-flag-286000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:force-systemd-flag-286000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 16:20:43.497875    6150 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 16:20:43.519159    6150 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0831 16:20:43.519243    6150 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:20:43.519261    6150 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:20:43.527893    6150 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53724
	I0831 16:20:43.528316    6150 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:20:43.528802    6150 main.go:141] libmachine: Using API Version  1
	I0831 16:20:43.528832    6150 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:20:43.529176    6150 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:20:43.529286    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .GetMachineName
	I0831 16:20:43.529386    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .DriverName
	I0831 16:20:43.529491    6150 start.go:159] libmachine.API.Create for "force-systemd-flag-286000" (driver="hyperkit")
	I0831 16:20:43.529520    6150 client.go:168] LocalClient.Create starting
	I0831 16:20:43.529547    6150 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 16:20:43.529596    6150 main.go:141] libmachine: Decoding PEM data...
	I0831 16:20:43.529613    6150 main.go:141] libmachine: Parsing certificate...
	I0831 16:20:43.529655    6150 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 16:20:43.529693    6150 main.go:141] libmachine: Decoding PEM data...
	I0831 16:20:43.529705    6150 main.go:141] libmachine: Parsing certificate...
	I0831 16:20:43.529718    6150 main.go:141] libmachine: Running pre-create checks...
	I0831 16:20:43.529724    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .PreCreateCheck
	I0831 16:20:43.529799    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:43.529841    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .GetConfigRaw
	I0831 16:20:43.561303    6150 main.go:141] libmachine: Creating machine...
	I0831 16:20:43.561314    6150 main.go:141] libmachine: (force-systemd-flag-286000) Calling .Create
	I0831 16:20:43.561420    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:43.561589    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | I0831 16:20:43.561408    6220 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:20:43.561629    6150 main.go:141] libmachine: (force-systemd-flag-286000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 16:20:43.785646    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | I0831 16:20:43.785553    6220 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/id_rsa...
	I0831 16:20:43.827482    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | I0831 16:20:43.827412    6220 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/force-systemd-flag-286000.rawdisk...
	I0831 16:20:43.827492    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Writing magic tar header
	I0831 16:20:43.827502    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Writing SSH key tar header
	I0831 16:20:43.827869    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | I0831 16:20:43.827830    6220 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000 ...
	I0831 16:20:44.192725    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:44.192746    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/hyperkit.pid
	I0831 16:20:44.192756    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Using UUID 1ca91632-75bb-455f-916e-d3b69498764d
	I0831 16:20:44.220627    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Generated MAC 42:e7:3b:b0:b5:40
	I0831 16:20:44.220647    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-286000
	I0831 16:20:44.220687    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"1ca91632-75bb-455f-916e-d3b69498764d", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:20:44.220716    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"1ca91632-75bb-455f-916e-d3b69498764d", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:20:44.220760    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "1ca91632-75bb-455f-916e-d3b69498764d", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/force-systemd-flag-286000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-286000"}
	I0831 16:20:44.220796    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 1ca91632-75bb-455f-916e-d3b69498764d -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/force-systemd-flag-286000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-286000"
	I0831 16:20:44.220805    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:20:44.223680    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 DEBUG: hyperkit: Pid is 6221
	I0831 16:20:44.224203    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 0
	I0831 16:20:44.224219    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:44.224306    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:20:44.225338    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:20:44.225412    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:44.225455    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:44.225476    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:44.225526    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:44.225550    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:44.225564    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:44.225577    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:44.225600    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:44.225615    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:44.225630    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:44.225639    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:44.225646    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:44.225652    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:44.225661    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:44.225672    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:44.225684    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:44.225697    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:44.225709    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:44.231275    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:20:44.239794    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-flag-286000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:20:44.240489    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:20:44.240506    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:20:44.240514    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:20:44.240520    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:20:44.618588    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:20:44.618599    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:20:44.733838    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:20:44.733853    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:20:44.733870    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:20:44.733880    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:20:44.734427    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:20:44.734439    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:44 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:20:46.227204    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 1
	I0831 16:20:46.227220    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:46.227318    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:20:46.228121    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:20:46.228193    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:46.228204    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:46.228225    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:46.228233    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:46.228252    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:46.228274    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:46.228289    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:46.228298    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:46.228303    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:46.228363    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:46.228395    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:46.228403    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:46.228414    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:46.228421    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:46.228429    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:46.228453    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:46.228466    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:46.228483    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:48.230287    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 2
	I0831 16:20:48.230306    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:48.230346    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:20:48.231144    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:20:48.231203    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:48.231222    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:48.231230    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:48.231242    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:48.231251    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:48.231275    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:48.231284    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:48.231292    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:48.231298    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:48.231310    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:48.231323    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:48.231332    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:48.231340    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:48.231354    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:48.231364    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:48.231373    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:48.231380    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:48.231389    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:50.125165    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:50 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 16:20:50.125294    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:50 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 16:20:50.125302    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:50 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 16:20:50.147544    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | 2024/08/31 16:20:50 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 16:20:50.232645    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 3
	I0831 16:20:50.232674    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:50.232864    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:20:50.234318    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:20:50.234465    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:50.234495    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:50.234514    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:50.234526    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:50.234539    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:50.234552    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:50.234565    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:50.234580    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:50.234594    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:50.234611    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:50.234624    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:50.234640    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:50.234655    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:50.234671    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:50.234702    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:50.234727    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:50.234742    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:50.234758    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:52.236659    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 4
	I0831 16:20:52.236675    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:52.236756    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:20:52.237586    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:20:52.237660    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:52.237672    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:52.237688    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:52.237695    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:52.237702    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:52.237715    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:52.237726    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:52.237734    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:52.237742    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:52.237748    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:52.237755    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:52.237763    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:52.237768    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:52.237786    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:52.237797    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:52.237805    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:52.237813    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:52.237829    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:54.238226    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 5
	I0831 16:20:54.238237    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:54.238295    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:20:54.239106    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:20:54.239157    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:54.239170    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:54.239180    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:54.239198    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:54.239213    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:54.239225    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:54.239233    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:54.239242    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:54.239257    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:54.239270    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:54.239282    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:54.239296    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:54.239312    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:54.239324    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:54.239334    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:54.239342    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:54.239349    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:54.239356    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:56.240002    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 6
	I0831 16:20:56.240016    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:56.240090    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:20:56.240917    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:20:56.240965    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:56.240977    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:56.240993    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:56.241006    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:56.241015    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:56.241025    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:56.241032    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:56.241038    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:56.241045    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:56.241051    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:56.241059    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:56.241066    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:56.241073    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:56.241081    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:56.241089    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:56.241097    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:56.241102    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:56.241108    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:20:58.243144    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 7
	I0831 16:20:58.243163    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:20:58.243273    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:20:58.244089    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:20:58.244139    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:20:58.244150    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:20:58.244158    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:20:58.244164    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:20:58.244188    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:20:58.244208    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:20:58.244217    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:20:58.244225    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:20:58.244232    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:20:58.244241    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:20:58.244248    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:20:58.244256    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:20:58.244262    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:20:58.244270    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:20:58.244277    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:20:58.244285    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:20:58.244297    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:20:58.244306    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:00.245119    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 8
	I0831 16:21:00.245133    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:00.245204    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:00.246019    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:00.246062    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:00.246074    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:00.246082    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:00.246089    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:00.246099    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:00.246107    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:00.246115    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:00.246131    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:00.246148    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:00.246161    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:00.246179    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:00.246190    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:00.246198    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:00.246208    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:00.246216    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:00.246224    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:00.246231    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:00.246239    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:02.247147    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 9
	I0831 16:21:02.247163    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:02.247287    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:02.248124    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:02.248166    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:02.248181    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:02.248204    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:02.248213    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:02.248222    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:02.248229    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:02.248236    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:02.248244    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:02.248258    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:02.248270    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:02.248281    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:02.248289    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:02.248296    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:02.248302    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:02.248309    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:02.248317    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:02.248324    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:02.248331    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:04.250199    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 10
	I0831 16:21:04.250210    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:04.250265    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:04.251075    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:04.251118    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:04.251129    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:04.251138    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:04.251160    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:04.251181    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:04.251194    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:04.251207    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:04.251217    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:04.251224    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:04.251232    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:04.251239    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:04.251245    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:04.251253    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:04.251262    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:04.251269    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:04.251278    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:04.251286    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:04.251295    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:06.252943    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 11
	I0831 16:21:06.252958    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:06.253055    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:06.253890    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:06.253937    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:06.253950    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:06.253970    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:06.253986    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:06.253995    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:06.254003    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:06.254010    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:06.254043    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:06.254053    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:06.254061    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:06.254069    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:06.254077    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:06.254084    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:06.254091    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:06.254097    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:06.254104    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:06.254112    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:06.254123    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:08.256101    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 12
	I0831 16:21:08.256116    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:08.256199    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:08.257009    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:08.257060    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:08.257070    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:08.257099    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:08.257110    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:08.257122    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:08.257135    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:08.257150    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:08.257163    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:08.257180    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:08.257193    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:08.257201    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:08.257216    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:08.257224    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:08.257234    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:08.257240    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:08.257246    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:08.257267    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:08.257281    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:10.257359    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 13
	I0831 16:21:10.257370    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:10.257435    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:10.258220    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:10.258269    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:10.258282    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:10.258298    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:10.258309    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:10.258315    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:10.258337    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:10.258346    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:10.258360    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:10.258370    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:10.258378    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:10.258387    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:10.258400    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:10.258407    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:10.258414    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:10.258422    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:10.258430    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:10.258437    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:10.258445    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:12.260497    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 14
	I0831 16:21:12.260512    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:12.260580    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:12.261486    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:12.261529    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:12.261538    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:12.261547    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:12.261555    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:12.261563    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:12.261569    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:12.261584    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:12.261591    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:12.261599    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:12.261606    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:12.261615    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:12.261623    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:12.261630    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:12.261637    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:12.261651    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:12.261663    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:12.261671    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:12.261680    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:14.263750    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 15
	I0831 16:21:14.263763    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:14.263825    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:14.264696    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:14.264742    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:14.264764    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:14.264779    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:14.264789    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:14.264796    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:14.264805    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:14.264813    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:14.264831    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:14.264845    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:14.264853    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:14.264867    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:14.264875    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:14.264883    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:14.264899    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:14.264914    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:14.264937    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:14.264974    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:14.264993    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:16.266760    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 16
	I0831 16:21:16.266772    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:16.266839    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:16.267630    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:16.267688    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:16.267700    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:16.267717    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:16.267727    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:16.267744    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:16.267756    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:16.267764    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:16.267773    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:16.267785    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:16.267793    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:16.267800    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:16.267810    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:16.267818    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:16.267826    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:16.267833    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:16.267841    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:16.267848    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:16.267856    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:18.269886    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 17
	I0831 16:21:18.269901    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:18.269946    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:18.270844    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:18.270899    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:18.270910    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:18.270919    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:18.270926    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:18.270933    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:18.270940    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:18.270946    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:18.270953    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:18.270966    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:18.270974    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:18.270988    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:18.271001    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:18.271009    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:18.271017    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:18.271025    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:18.271031    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:18.271038    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:18.271047    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:20.271663    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 18
	I0831 16:21:20.271674    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:20.271732    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:20.272527    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:20.272565    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:20.272575    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:20.272590    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:20.272602    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:20.272610    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:20.272618    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:20.272634    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:20.272644    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:20.272651    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:20.272657    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:20.272670    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:20.272684    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:20.272695    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:20.272702    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:20.272710    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:20.272717    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:20.272738    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:20.272752    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:22.274774    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 19
	I0831 16:21:22.274785    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:22.274896    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:22.275707    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:22.275757    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:22.275775    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:22.275787    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:22.275794    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:22.275800    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:22.275807    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:22.275826    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:22.275832    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:22.275846    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:22.275863    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:22.275873    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:22.275882    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:22.275896    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:22.275911    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:22.275921    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:22.275929    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:22.275936    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:22.275945    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:24.276765    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 20
	I0831 16:21:24.276776    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:24.276856    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:24.277664    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:24.277717    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:24.277726    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:24.277747    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:24.277755    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:24.277763    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:24.277770    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:24.277786    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:24.277799    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:24.277808    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:24.277816    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:24.277822    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:24.277834    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:24.277855    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:24.277864    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:24.277871    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:24.277880    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:24.277887    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:24.277895    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:26.279926    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 21
	I0831 16:21:26.279936    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:26.280006    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:26.280793    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:26.280843    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:26.280857    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:26.280873    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:26.280883    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:26.280892    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:26.280900    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:26.280908    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:26.280916    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:26.280924    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:26.280932    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:26.280947    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:26.280958    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:26.280965    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:26.280972    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:26.280979    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:26.280986    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:26.280992    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:26.280999    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:28.281060    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 22
	I0831 16:21:28.281074    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:28.281147    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:28.281972    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:28.282024    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:28.282036    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:28.282053    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:28.282064    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:28.282081    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:28.282093    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:28.282102    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:28.282111    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:28.282122    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:28.282132    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:28.282146    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:28.282154    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:28.282167    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:28.282175    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:28.282191    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:28.282205    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:28.282221    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:28.282230    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:30.283172    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 23
	I0831 16:21:30.283183    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:30.283251    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:30.284031    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:30.284074    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:30.284083    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:30.284109    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:30.284129    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:30.284143    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:30.284152    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:30.284158    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:30.284166    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:30.284178    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:30.284184    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:30.284191    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:30.284197    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:30.284204    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:30.284210    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:30.284218    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:30.284228    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:30.284240    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:30.284249    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:32.286284    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 24
	I0831 16:21:32.286298    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:32.286365    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:32.287181    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:32.287225    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:32.287245    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:32.287253    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:32.287271    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:32.287285    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:32.287294    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:32.287302    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:32.287311    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:32.287320    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:32.287328    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:32.287334    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:32.287340    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:32.287347    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:32.287355    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:32.287366    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:32.287375    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:32.287382    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:32.287391    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:34.289388    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 25
	I0831 16:21:34.289408    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:34.289481    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:34.290344    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:34.290387    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:34.290395    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:34.290405    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:34.290415    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:34.290425    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:34.290437    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:34.290444    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:34.290452    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:34.290468    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:34.290483    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:34.290494    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:34.290500    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:34.290515    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:34.290528    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:34.290536    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:34.290544    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:34.290552    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:34.290559    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:36.291032    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 26
	I0831 16:21:36.291045    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:36.291129    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:36.291953    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:36.292012    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:36.292021    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:36.292034    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:36.292047    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:36.292059    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:36.292067    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:36.292074    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:36.292081    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:36.292087    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:36.292094    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:36.292102    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:36.292113    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:36.292122    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:36.292129    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:36.292136    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:36.292143    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:36.292149    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:36.292155    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:38.293355    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 27
	I0831 16:21:38.293368    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:38.293525    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:38.294348    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:38.294383    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:38.294393    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:38.294410    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:38.294417    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:38.294436    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:38.294442    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:38.294450    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:38.294465    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:38.294483    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:38.294494    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:38.294503    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:38.294510    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:38.294517    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:38.294525    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:38.294539    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:38.294548    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:38.294555    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:38.294567    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:40.294905    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 28
	I0831 16:21:40.294924    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:40.295403    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:40.295808    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:40.295872    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:40.295881    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:40.295908    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:40.295919    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:40.295927    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:40.295934    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:40.295952    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:40.295960    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:40.295989    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:40.296000    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:40.296008    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:40.296065    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:40.296092    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:40.296141    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:40.296167    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:40.296179    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:40.296189    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:40.296200    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:42.297264    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Attempt 29
	I0831 16:21:42.297284    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:21:42.297314    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | hyperkit pid from json: 6221
	I0831 16:21:42.298110    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Searching for 42:e7:3b:b0:b5:40 in /var/db/dhcpd_leases ...
	I0831 16:21:42.298166    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:21:42.298178    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:21:42.298188    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:21:42.298197    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:21:42.298205    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:21:42.298212    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:21:42.298221    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:21:42.298255    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:21:42.298285    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:21:42.298320    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:21:42.298328    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:21:42.298344    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:21:42.298358    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:21:42.298370    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:21:42.298378    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:21:42.298386    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:21:42.298394    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:21:42.298433    6150 main.go:141] libmachine: (force-systemd-flag-286000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:21:44.299339    6150 client.go:171] duration metric: took 1m0.769405786s to LocalClient.Create
	I0831 16:21:46.301454    6150 start.go:128] duration metric: took 1m2.803154974s to createHost
	I0831 16:21:46.301483    6150 start.go:83] releasing machines lock for "force-systemd-flag-286000", held for 1m2.803252458s
	W0831 16:21:46.301601    6150 out.go:270] * Failed to start hyperkit VM. Running "minikube delete -p force-systemd-flag-286000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 42:e7:3b:b0:b5:40
	* Failed to start hyperkit VM. Running "minikube delete -p force-systemd-flag-286000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 42:e7:3b:b0:b5:40
	I0831 16:21:46.364996    6150 out.go:201] 
	W0831 16:21:46.385856    6150 out.go:270] X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 42:e7:3b:b0:b5:40
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 42:e7:3b:b0:b5:40
	W0831 16:21:46.385866    6150 out.go:270] * 
	* 
	W0831 16:21:46.386484    6150 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 16:21:46.448763    6150 out.go:201] 

** /stderr **
docker_test.go:93: failed to start minikube with args: "out/minikube-darwin-amd64 start -p force-systemd-flag-286000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit " : exit status 80
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-flag-286000 ssh "docker info --format {{.CgroupDriver}}"
docker_test.go:110: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p force-systemd-flag-286000 ssh "docker info --format {{.CgroupDriver}}": exit status 50 (186.74469ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node force-systemd-flag-286000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

** /stderr **
docker_test.go:112: failed to get docker cgroup driver. args "out/minikube-darwin-amd64 -p force-systemd-flag-286000 ssh \"docker info --format {{.CgroupDriver}}\"": exit status 50
docker_test.go:106: *** TestForceSystemdFlag FAILED at 2024-08-31 16:21:46.748951 -0700 PDT m=+4603.944572568
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p force-systemd-flag-286000 -n force-systemd-flag-286000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p force-systemd-flag-286000 -n force-systemd-flag-286000: exit status 7 (78.759583ms)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0831 16:21:46.825753    6242 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0831 16:21:46.825777    6242 status.go:249] status error: getting IP: IP address is not set

** /stderr **
helpers_test.go:240: status error: exit status 7 (may be ok)
helpers_test.go:242: "force-systemd-flag-286000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:176: Cleaning up "force-systemd-flag-286000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-flag-286000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-flag-286000: (5.250002378s)
--- FAIL: TestForceSystemdFlag (252.02s)

TestForceSystemdEnv (234.76s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-env-257000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:155: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p force-systemd-env-257000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit : exit status 80 (3m49.175505588s)

-- stdout --
	* [force-systemd-env-257000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=true
	* Using the hyperkit driver based on user configuration
	* Starting "force-systemd-env-257000" primary control-plane node in "force-systemd-env-257000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "force-systemd-env-257000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	I0831 16:14:48.561557    6072 out.go:345] Setting OutFile to fd 1 ...
	I0831 16:14:48.561834    6072 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:14:48.561840    6072 out.go:358] Setting ErrFile to fd 2...
	I0831 16:14:48.561843    6072 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:14:48.562027    6072 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 16:14:48.563452    6072 out.go:352] Setting JSON to false
	I0831 16:14:48.585505    6072 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":4459,"bootTime":1725141629,"procs":436,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 16:14:48.585608    6072 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 16:14:48.607890    6072 out.go:177] * [force-systemd-env-257000] minikube v1.33.1 on Darwin 14.6.1
	I0831 16:14:48.651530    6072 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 16:14:48.651595    6072 notify.go:220] Checking for updates...
	I0831 16:14:48.693282    6072 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:14:48.714515    6072 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 16:14:48.735488    6072 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 16:14:48.756318    6072 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:14:48.778450    6072 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=true
	I0831 16:14:48.799728    6072 config.go:182] Loaded profile config "offline-docker-207000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:14:48.799806    6072 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 16:14:48.828504    6072 out.go:177] * Using the hyperkit driver based on user configuration
	I0831 16:14:48.870291    6072 start.go:297] selected driver: hyperkit
	I0831 16:14:48.870302    6072 start.go:901] validating driver "hyperkit" against <nil>
	I0831 16:14:48.870318    6072 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 16:14:48.873095    6072 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:14:48.873201    6072 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 16:14:48.881384    6072 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 16:14:48.885197    6072 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:14:48.885217    6072 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 16:14:48.885246    6072 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 16:14:48.885432    6072 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0831 16:14:48.885459    6072 cni.go:84] Creating CNI manager for ""
	I0831 16:14:48.885476    6072 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0831 16:14:48.885482    6072 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0831 16:14:48.885545    6072 start.go:340] cluster config:
	{Name:force-systemd-env-257000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:force-systemd-env-257000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluste
r.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:14:48.885627    6072 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:14:48.927515    6072 out.go:177] * Starting "force-systemd-env-257000" primary control-plane node in "force-systemd-env-257000" cluster
	I0831 16:14:48.948336    6072 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 16:14:48.948359    6072 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 16:14:48.948373    6072 cache.go:56] Caching tarball of preloaded images
	I0831 16:14:48.948463    6072 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 16:14:48.948472    6072 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 16:14:48.948539    6072 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/force-systemd-env-257000/config.json ...
	I0831 16:14:48.948557    6072 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/force-systemd-env-257000/config.json: {Name:mk8888fe2f70e9db57f4ecd8d94efd6e8ecb676d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 16:14:48.948855    6072 start.go:360] acquireMachinesLock for force-systemd-env-257000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:15:28.762065    6072 start.go:364] duration metric: took 39.812933701s to acquireMachinesLock for "force-systemd-env-257000"
	I0831 16:15:28.762107    6072 start.go:93] Provisioning new machine with config: &{Name:force-systemd-env-257000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kubernet
esConfig:{KubernetesVersion:v1.31.0 ClusterName:force-systemd-env-257000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOp
timizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 16:15:28.762156    6072 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 16:15:28.783459    6072 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0831 16:15:28.783595    6072 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:15:28.783636    6072 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:15:28.792196    6072 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53678
	I0831 16:15:28.792568    6072 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:15:28.792963    6072 main.go:141] libmachine: Using API Version  1
	I0831 16:15:28.792974    6072 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:15:28.793231    6072 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:15:28.793349    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .GetMachineName
	I0831 16:15:28.793442    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .DriverName
	I0831 16:15:28.793550    6072 start.go:159] libmachine.API.Create for "force-systemd-env-257000" (driver="hyperkit")
	I0831 16:15:28.793573    6072 client.go:168] LocalClient.Create starting
	I0831 16:15:28.793606    6072 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 16:15:28.793658    6072 main.go:141] libmachine: Decoding PEM data...
	I0831 16:15:28.793677    6072 main.go:141] libmachine: Parsing certificate...
	I0831 16:15:28.793737    6072 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 16:15:28.793775    6072 main.go:141] libmachine: Decoding PEM data...
	I0831 16:15:28.793783    6072 main.go:141] libmachine: Parsing certificate...
	I0831 16:15:28.793800    6072 main.go:141] libmachine: Running pre-create checks...
	I0831 16:15:28.793809    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .PreCreateCheck
	I0831 16:15:28.793880    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:28.794032    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .GetConfigRaw
	I0831 16:15:28.862366    6072 main.go:141] libmachine: Creating machine...
	I0831 16:15:28.862376    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .Create
	I0831 16:15:28.862469    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:28.862618    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | I0831 16:15:28.862470    6095 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:15:28.862652    6072 main.go:141] libmachine: (force-systemd-env-257000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 16:15:29.070641    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | I0831 16:15:29.070546    6095 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/id_rsa...
	I0831 16:15:29.225466    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | I0831 16:15:29.225374    6095 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/force-systemd-env-257000.rawdisk...
	I0831 16:15:29.225482    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Writing magic tar header
	I0831 16:15:29.225494    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Writing SSH key tar header
	I0831 16:15:29.226069    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | I0831 16:15:29.226023    6095 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000 ...
	I0831 16:15:29.588766    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:29.588786    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/hyperkit.pid
	I0831 16:15:29.588796    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Using UUID 51df484f-9ab9-4aaf-8947-d31e2237f1bc
	I0831 16:15:29.613284    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Generated MAC 7a:78:9c:d7:c8:b5
	I0831 16:15:29.613303    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-257000
	I0831 16:15:29.613333    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"51df484f-9ab9-4aaf-8947-d31e2237f1bc", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00051e1b0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(
nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:15:29.613375    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"51df484f-9ab9-4aaf-8947-d31e2237f1bc", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00051e1b0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(
nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:15:29.613454    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "51df484f-9ab9-4aaf-8947-d31e2237f1bc", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/force-systemd-env-257000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-e
nv-257000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-257000"}
	I0831 16:15:29.613507    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 51df484f-9ab9-4aaf-8947-d31e2237f1bc -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/force-systemd-env-257000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/bzimage,/Users/jenkins/minikube-integration/18943-95
7/.minikube/machines/force-systemd-env-257000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-257000"
	I0831 16:15:29.613527    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:15:29.616345    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 DEBUG: hyperkit: Pid is 6096
	I0831 16:15:29.616814    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 0
	I0831 16:15:29.616843    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:29.616912    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:29.617824    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:29.617888    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:29.617902    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:29.617917    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:29.617927    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:29.617938    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:29.617945    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:29.617952    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:29.617958    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:29.617988    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:29.618015    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:29.618041    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:29.618075    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:29.618092    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:29.618109    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:29.618122    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:29.618137    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:29.618161    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:29.618174    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:29.623973    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:15:29.632009    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:15:29.632858    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:15:29.632886    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:15:29.632901    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:15:29.632913    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:15:30.009158    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:30 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:15:30.009176    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:30 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:15:30.123739    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:30 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:15:30.123760    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:30 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:15:30.123774    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:30 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:15:30.123797    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:30 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:15:30.124665    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:30 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:15:30.124678    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:30 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:15:31.618886    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 1
	I0831 16:15:31.618903    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:31.619000    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:31.619800    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:31.619843    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:31.619868    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:31.619879    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:31.619888    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:31.619895    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:31.619914    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:31.619924    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:31.619932    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:31.619938    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:31.619944    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:31.619951    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:31.619961    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:31.619971    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:31.619979    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:31.619990    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:31.620000    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:31.620012    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:31.620025    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:33.622096    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 2
	I0831 16:15:33.622114    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:33.622159    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:33.623095    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:33.623147    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:33.623158    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:33.623167    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:33.623173    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:33.623187    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:33.623195    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:33.623201    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:33.623207    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:33.623222    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:33.623230    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:33.623237    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:33.623243    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:33.623251    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:33.623258    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:33.623265    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:33.623274    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:33.623281    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:33.623289    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:35.505166    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:35 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 16:15:35.505349    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:35 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 16:15:35.505362    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:35 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 16:15:35.526697    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:15:35 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 16:15:35.625450    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 3
	I0831 16:15:35.625479    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:35.625656    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:35.627258    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:35.627359    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:35.627392    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:35.627402    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:35.627411    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:35.627423    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:35.627438    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:35.627477    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:35.627498    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:35.627517    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:35.627528    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:35.627541    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:35.627554    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:35.627565    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:35.627586    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:35.627597    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:35.627608    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:35.627618    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:35.627627    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:37.628071    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 4
	I0831 16:15:37.628087    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:37.628152    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:37.628979    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:37.629035    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:37.629046    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:37.629058    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:37.629065    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:37.629072    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:37.629080    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:37.629089    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:37.629097    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:37.629119    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:37.629128    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:37.629136    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:37.629144    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:37.629152    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:37.629160    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:37.629168    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:37.629174    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:37.629187    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:37.629199    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:39.631218    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 5
	I0831 16:15:39.631234    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:39.631299    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:39.632214    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:39.632252    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:39.632267    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:39.632292    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:39.632300    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:39.632307    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:39.632313    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:39.632320    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:39.632326    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:39.632332    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:39.632344    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:39.632354    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:39.632360    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:39.632372    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:39.632388    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:39.632400    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:39.632416    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:39.632423    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:39.632431    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:41.632820    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 6
	I0831 16:15:41.632833    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:41.632887    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:41.633686    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:41.633740    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:41.633752    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:41.633760    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:41.633769    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:41.633779    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:41.633799    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:41.633810    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:41.633818    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:41.633825    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:41.633831    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:41.633837    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:41.633848    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:41.633857    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:41.633865    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:41.633873    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:41.633887    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:41.633898    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:41.633908    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:43.633964    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 7
	I0831 16:15:43.633977    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:43.634042    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:43.634836    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:43.634868    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:43.634877    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:43.634887    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:43.634895    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:43.634902    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:43.634908    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:43.634921    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:43.634934    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:43.634941    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:43.634950    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:43.634966    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:43.634979    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:43.634988    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:43.634996    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:43.635011    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:43.635023    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:43.635038    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:43.635046    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:45.635217    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 8
	I0831 16:15:45.635239    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:45.635332    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:45.636126    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:45.636177    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:45.636200    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:45.636224    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:45.636238    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:45.636253    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:45.636275    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:45.636284    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:45.636292    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:45.636304    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:45.636313    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:45.636320    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:45.636328    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:45.636337    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:45.636344    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:45.636352    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:45.636363    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:45.636370    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:45.636381    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:47.637164    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 9
	I0831 16:15:47.637180    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:47.637240    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:47.638274    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:47.638326    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:47.638334    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:47.638355    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:47.638367    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:47.638377    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:47.638384    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:47.638391    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:47.638398    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:47.638406    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:47.638414    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:47.638421    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:47.638429    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:47.638436    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:47.638444    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:47.638474    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:47.638486    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:47.638494    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:47.638502    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:49.638594    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 10
	I0831 16:15:49.638605    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:49.638662    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:49.639456    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:49.639509    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:49.639522    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:49.639531    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:49.639542    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:49.639550    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:49.639560    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:49.639569    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:49.639577    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:49.639591    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:49.639600    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:49.639607    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:49.639615    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:49.639623    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:49.639630    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:49.639638    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:49.639648    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:49.639656    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:49.639662    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:51.641363    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 11
	I0831 16:15:51.641395    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:51.641445    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:51.642250    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:51.642295    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:51.642310    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:51.642348    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:51.642361    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:51.642381    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:51.642391    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:51.642410    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:51.642427    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:51.642437    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:51.642445    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:51.642454    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:51.642461    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:51.642475    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:51.642482    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:51.642490    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:51.642499    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:51.642506    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:51.642522    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:53.642503    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 12
	I0831 16:15:53.642518    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:53.642609    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:53.643405    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:53.643430    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:53.643439    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:53.643449    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:53.643456    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:53.643463    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:53.643470    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:53.643476    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:53.643492    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:53.643502    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:53.643512    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:53.643521    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:53.643529    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:53.643535    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:53.643541    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:53.643548    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:53.643557    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:53.643564    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:53.643571    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:55.645632    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 13
	I0831 16:15:55.645649    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:55.645738    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:55.646523    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:55.646591    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:55.646604    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:55.646630    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:55.646639    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:55.646652    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:55.646662    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:55.646668    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:55.646674    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:55.646694    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:55.646705    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:55.646713    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:55.646726    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:55.646743    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:55.646760    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:55.646771    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:55.646777    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:55.646784    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:55.646794    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:57.647223    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 14
	I0831 16:15:57.647236    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:57.647295    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:57.648107    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:57.648162    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:57.648173    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:57.648191    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:57.648197    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:57.648209    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:57.648217    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:57.648225    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:57.648232    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:57.648239    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:57.648247    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:57.648261    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:57.648274    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:57.648284    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:57.648291    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:57.648305    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:57.648317    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:57.648331    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:57.648345    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:15:59.649408    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 15
	I0831 16:15:59.649420    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:15:59.649505    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:15:59.650371    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:15:59.650407    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:15:59.650416    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:15:59.650437    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:15:59.650457    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:15:59.650467    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:15:59.650478    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:15:59.650486    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:15:59.650494    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:15:59.650501    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:15:59.650510    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:15:59.650517    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:15:59.650525    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:15:59.650541    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:15:59.650553    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:15:59.650565    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:15:59.650574    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:15:59.650581    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:15:59.650590    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:01.652636    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 16
	I0831 16:16:01.652649    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:01.652710    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:01.653620    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:01.653666    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:01.653674    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:01.653684    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:01.653693    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:01.653701    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:01.653709    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:01.653716    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:01.653724    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:01.653741    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:01.653755    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:01.653773    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:01.653786    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:01.653803    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:01.653814    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:01.653821    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:01.653832    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:01.653841    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:01.653846    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:03.654729    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 17
	I0831 16:16:03.654744    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:03.654800    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:03.655624    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:03.655642    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:03.655670    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:03.655682    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:03.655689    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:03.655696    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:03.655703    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:03.655709    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:03.655718    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:03.655733    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:03.655749    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:03.655762    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:03.655772    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:03.655780    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:03.655797    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:03.655805    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:03.655812    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:03.655832    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:03.655844    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:05.656256    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 18
	I0831 16:16:05.656267    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:05.656340    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:05.657198    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:05.657259    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:05.657270    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:05.657277    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:05.657284    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:05.657292    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:05.657298    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:05.657305    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:05.657313    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:05.657320    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:05.657336    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:05.657352    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:05.657365    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:05.657373    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:05.657383    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:05.657389    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:05.657397    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:05.657405    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:05.657411    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:07.657637    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 19
	I0831 16:16:07.657650    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:07.657718    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:07.658507    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:07.658570    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:07.658586    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:07.658599    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:07.658608    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:07.658616    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:07.658625    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:07.658634    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:07.658640    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:07.658661    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:07.658673    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:07.658682    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:07.658691    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:07.658698    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:07.658706    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:07.658713    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:07.658721    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:07.658729    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:07.658740    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:09.660741    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 20
	I0831 16:16:09.660755    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:09.660857    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:09.661631    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:09.661676    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:09.661686    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:09.661695    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:09.661704    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:09.661710    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:09.661718    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:09.661725    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:09.661737    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:09.661744    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:09.661751    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:09.661759    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:09.661767    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:09.661781    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:09.661791    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:09.661801    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:09.661809    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:09.661816    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:09.661826    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:11.663879    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 21
	I0831 16:16:11.663907    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:11.663946    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:11.664832    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:11.664871    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:11.664881    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:11.664906    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:11.664926    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:11.664936    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:11.664944    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:11.664954    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:11.664962    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:11.664969    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:11.664980    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:11.664994    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:11.665007    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:11.665015    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:11.665027    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:11.665040    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:11.665052    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:11.665066    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:11.665080    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:13.665956    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 22
	I0831 16:16:13.665972    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:13.666039    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:13.666840    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:13.666881    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:13.666891    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:13.666901    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:13.666912    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:13.666919    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:13.666925    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:13.666943    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:13.666953    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:13.666969    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:13.666981    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:13.667008    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:13.667019    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:13.667032    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:13.667041    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:13.667050    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:13.667058    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:13.667066    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:13.667074    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:15.668011    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 23
	I0831 16:16:15.668023    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:15.668092    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:15.668884    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:15.668937    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:15.668949    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:15.668974    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:15.668987    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:15.669003    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:15.669013    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:15.669021    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:15.669030    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:15.669038    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:15.669046    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:15.669054    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:15.669062    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:15.669074    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:15.669082    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:15.669089    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:15.669095    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:15.669101    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:15.669111    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:17.671115    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 24
	I0831 16:16:17.671129    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:17.671198    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:17.672217    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:17.672233    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:17.672244    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:17.672253    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:17.672260    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:17.672273    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:17.672283    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:17.672290    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:17.672299    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:17.672308    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:17.672318    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:17.672332    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:17.672345    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:17.672353    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:17.672361    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:17.672369    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:17.672377    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:17.672384    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:17.672392    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:19.674431    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 25
	I0831 16:16:19.674445    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:19.674518    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:19.675306    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:19.675355    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:19.675367    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:19.675390    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:19.675406    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:19.675417    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:19.675424    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:19.675432    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:19.675438    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:19.675444    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:19.675451    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:19.675459    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:19.675475    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:19.675487    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:19.675503    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:19.675517    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:19.675526    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:19.675536    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:19.675545    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:21.677559    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 26
	I0831 16:16:21.677583    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:21.677627    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:21.678467    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:21.678521    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:21.678537    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:21.678546    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:21.678554    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:21.678587    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:21.678596    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:21.678606    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:21.678619    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:21.678635    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:21.678649    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:21.678657    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:21.678665    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:21.678674    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:21.678682    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:21.678691    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:21.678699    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:21.678706    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:21.678712    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:23.680693    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 27
	I0831 16:16:23.680708    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:23.680834    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:23.681687    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:23.681735    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:23.681752    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:23.681765    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:23.681772    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:23.681779    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:23.681788    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:23.681795    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:23.681803    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:23.681810    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:23.681830    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:23.681840    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:23.681850    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:23.681857    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:23.681865    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:23.681873    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:23.681881    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:23.681898    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:23.681910    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:25.683065    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 28
	I0831 16:16:25.683080    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:25.683191    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:25.683985    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:25.684019    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:25.684027    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:25.684035    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:25.684042    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:25.684049    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:25.684055    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:25.684075    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:25.684085    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:25.684092    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:25.684098    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:25.684108    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:25.684115    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:25.684125    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:25.684133    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:25.684141    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:25.684147    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:25.684154    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:25.684162    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:27.684796    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 29
	I0831 16:16:27.684814    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:27.684859    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:27.685899    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for 7a:78:9c:d7:c8:b5 in /var/db/dhcpd_leases ...
	I0831 16:16:27.685946    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:16:27.685960    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:16:27.685977    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:16:27.685986    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:16:27.685993    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:16:27.686000    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:16:27.686006    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:16:27.686014    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:16:27.686031    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:16:27.686056    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:16:27.686068    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:16:27.686078    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:16:27.686088    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:16:27.686099    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:16:27.686108    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:16:27.686114    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:16:27.686122    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:16:27.686130    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:16:29.688223    6072 client.go:171] duration metric: took 1m0.894237795s to LocalClient.Create
	I0831 16:16:31.689450    6072 start.go:128] duration metric: took 1m2.926855924s to createHost
	I0831 16:16:31.689468    6072 start.go:83] releasing machines lock for "force-systemd-env-257000", held for 1m2.926974706s
	W0831 16:16:31.689525    6072 start.go:714] error starting host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 7a:78:9c:d7:c8:b5
	I0831 16:16:31.689848    6072 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:16:31.689871    6072 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:16:31.698656    6072 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53680
	I0831 16:16:31.699120    6072 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:16:31.699606    6072 main.go:141] libmachine: Using API Version  1
	I0831 16:16:31.699623    6072 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:16:31.699946    6072 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:16:31.700316    6072 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:16:31.700363    6072 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:16:31.709165    6072 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53682
	I0831 16:16:31.709532    6072 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:16:31.709989    6072 main.go:141] libmachine: Using API Version  1
	I0831 16:16:31.709998    6072 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:16:31.710291    6072 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:16:31.710419    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .GetState
	I0831 16:16:31.710511    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:31.710584    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:31.711549    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .DriverName
	I0831 16:16:31.732989    6072 out.go:177] * Deleting "force-systemd-env-257000" in hyperkit ...
	I0831 16:16:31.774847    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .Remove
	I0831 16:16:31.775035    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:31.775045    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:31.775116    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:31.776050    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:31.776107    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | waiting for graceful shutdown
	I0831 16:16:32.776305    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:32.776369    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:32.777317    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | waiting for graceful shutdown
	I0831 16:16:33.779471    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:33.779538    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:33.781242    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | waiting for graceful shutdown
	I0831 16:16:34.782404    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:34.782489    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:34.783326    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | waiting for graceful shutdown
	I0831 16:16:35.784865    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:35.784973    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:35.785527    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | waiting for graceful shutdown
	I0831 16:16:36.786122    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:36.786210    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6096
	I0831 16:16:36.787129    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | sending sigkill
	I0831 16:16:36.787139    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:16:36.797917    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:16:36 WARN : hyperkit: failed to read stdout: EOF
	I0831 16:16:36.797948    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:16:36 WARN : hyperkit: failed to read stderr: EOF
	W0831 16:16:36.818501    6072 out.go:270] ! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 7a:78:9c:d7:c8:b5
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 7a:78:9c:d7:c8:b5
	I0831 16:16:36.818519    6072 start.go:729] Will try again in 5 seconds ...
	I0831 16:16:41.818741    6072 start.go:360] acquireMachinesLock for force-systemd-env-257000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:17:34.471429    6072 start.go:364] duration metric: took 52.652301398s to acquireMachinesLock for "force-systemd-env-257000"
	I0831 16:17:34.471463    6072 start.go:93] Provisioning new machine with config: &{Name:force-systemd-env-257000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:force-systemd-env-257000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 16:17:34.471519    6072 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 16:17:34.513819    6072 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0831 16:17:34.513887    6072 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:17:34.513910    6072 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:17:34.522863    6072 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53686
	I0831 16:17:34.523314    6072 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:17:34.523756    6072 main.go:141] libmachine: Using API Version  1
	I0831 16:17:34.523769    6072 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:17:34.524150    6072 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:17:34.524310    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .GetMachineName
	I0831 16:17:34.524411    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .DriverName
	I0831 16:17:34.524528    6072 start.go:159] libmachine.API.Create for "force-systemd-env-257000" (driver="hyperkit")
	I0831 16:17:34.524545    6072 client.go:168] LocalClient.Create starting
	I0831 16:17:34.524570    6072 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 16:17:34.524624    6072 main.go:141] libmachine: Decoding PEM data...
	I0831 16:17:34.524638    6072 main.go:141] libmachine: Parsing certificate...
	I0831 16:17:34.524677    6072 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 16:17:34.524714    6072 main.go:141] libmachine: Decoding PEM data...
	I0831 16:17:34.524726    6072 main.go:141] libmachine: Parsing certificate...
	I0831 16:17:34.524739    6072 main.go:141] libmachine: Running pre-create checks...
	I0831 16:17:34.524744    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .PreCreateCheck
	I0831 16:17:34.524821    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:34.524856    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .GetConfigRaw
	I0831 16:17:34.535253    6072 main.go:141] libmachine: Creating machine...
	I0831 16:17:34.535262    6072 main.go:141] libmachine: (force-systemd-env-257000) Calling .Create
	I0831 16:17:34.535347    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:34.535509    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | I0831 16:17:34.535340    6139 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:17:34.535560    6072 main.go:141] libmachine: (force-systemd-env-257000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 16:17:34.861399    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | I0831 16:17:34.861325    6139 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/id_rsa...
	I0831 16:17:35.003267    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | I0831 16:17:35.003207    6139 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/force-systemd-env-257000.rawdisk...
	I0831 16:17:35.003282    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Writing magic tar header
	I0831 16:17:35.003303    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Writing SSH key tar header
	I0831 16:17:35.003599    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | I0831 16:17:35.003569    6139 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000 ...
	I0831 16:17:35.367240    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:35.367262    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/hyperkit.pid
	I0831 16:17:35.367320    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Using UUID c74a9d00-dce4-463a-a058-1e1edff892c1
	I0831 16:17:35.392464    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Generated MAC ea:cd:5:24:60:3
	I0831 16:17:35.392486    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-257000
	I0831 16:17:35.392546    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"c74a9d00-dce4-463a-a058-1e1edff892c1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0000ba1b0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:17:35.392590    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"c74a9d00-dce4-463a-a058-1e1edff892c1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0000ba1b0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:17:35.392650    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "c74a9d00-dce4-463a-a058-1e1edff892c1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/force-systemd-env-257000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-257000"}
	I0831 16:17:35.392696    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U c74a9d00-dce4-463a-a058-1e1edff892c1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/force-systemd-env-257000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-257000"
	I0831 16:17:35.392707    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:17:35.395642    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 DEBUG: hyperkit: Pid is 6149
	I0831 16:17:35.396143    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 0
	I0831 16:17:35.396158    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:35.396248    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:35.397285    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:35.397322    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:35.397368    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:35.397395    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:35.397422    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:35.397469    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:35.397488    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:35.397499    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:35.397508    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:35.397515    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:35.397529    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:35.397539    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:35.397548    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:35.397557    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:35.397567    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:35.397575    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:35.397582    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:35.397609    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:35.397621    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:35.403464    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:17:35.411411    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/force-systemd-env-257000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:17:35.412319    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:17:35.412342    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:17:35.412372    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:17:35.412393    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:17:35.789976    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:17:35.789992    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:17:35.904894    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:17:35.904912    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:17:35.904924    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:17:35.904944    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:17:35.905787    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:17:35.905799    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:35 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:17:37.398122    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 1
	I0831 16:17:37.398139    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:37.398221    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:37.399013    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:37.399069    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:37.399079    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:37.399087    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:37.399095    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:37.399115    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:37.399130    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:37.399137    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:37.399169    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:37.399182    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:37.399189    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:37.399198    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:37.399204    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:37.399213    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:37.399220    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:37.399226    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:37.399233    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:37.399240    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:37.399245    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:39.400487    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 2
	I0831 16:17:39.400501    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:39.400641    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:39.401534    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:39.401577    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:39.401588    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:39.401606    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:39.401622    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:39.401633    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:39.401639    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:39.401646    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:39.401662    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:39.401673    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:39.401680    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:39.401687    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:39.401693    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:39.401701    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:39.401708    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:39.401716    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:39.401723    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:39.401731    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:39.401740    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:41.279615    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:41 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 16:17:41.279756    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:41 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 16:17:41.279766    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:41 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 16:17:41.299970    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | 2024/08/31 16:17:41 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 16:17:41.403984    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 3
	I0831 16:17:41.404011    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:41.404198    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:41.405630    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:41.405749    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:41.405770    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:41.405805    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:41.405823    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:41.405849    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:41.405869    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:41.405892    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:41.405925    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:41.405936    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:41.405949    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:41.405966    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:41.405978    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:41.405988    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:41.405996    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:41.406005    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:41.406016    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:41.406026    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:41.406037    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:43.406756    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 4
	I0831 16:17:43.406771    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:43.406841    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:43.407653    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:43.407707    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:43.407724    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:43.407744    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:43.407767    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:43.407785    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:43.407798    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:43.407807    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:43.407814    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:43.407821    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:43.407828    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:43.407835    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:43.407842    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:43.407850    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:43.407858    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:43.407866    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:43.407880    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:43.407892    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:43.407901    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:45.409887    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 5
	I0831 16:17:45.409902    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:45.409968    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:45.410957    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:45.411009    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:45.411019    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:45.411040    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:45.411058    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:45.411071    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:45.411084    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:45.411093    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:45.411102    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:45.411123    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:45.411133    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:45.411139    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:45.411147    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:45.411154    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:45.411162    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:45.411181    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:45.411198    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:45.411214    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:45.411227    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:47.413243    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 6
	I0831 16:17:47.413254    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:47.413321    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:47.414287    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:47.414331    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:47.414345    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:47.414360    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:47.414367    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:47.414377    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:47.414403    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:47.414415    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:47.414425    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:47.414433    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:47.414441    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:47.414454    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:47.414462    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:47.414472    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:47.414481    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:47.414496    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:47.414508    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:47.414517    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:47.414525    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:49.414478    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 7
	I0831 16:17:49.414490    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:49.414574    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:49.415404    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:49.415452    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:49.415464    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:49.415476    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:49.415483    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:49.415499    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:49.415511    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:49.415519    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:49.415525    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:49.415532    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:49.415549    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:49.415558    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:49.415567    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:49.415575    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:49.415582    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:49.415590    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:49.415597    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:49.415605    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:49.415613    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:51.417742    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 8
	I0831 16:17:51.417754    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:51.417838    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:51.418663    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:51.418700    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:51.418718    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:51.418734    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:51.418745    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:51.418780    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:51.418793    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:51.418800    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:51.418807    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:51.418817    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:51.418824    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:51.418831    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:51.418839    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:51.418847    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:51.418854    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:51.418860    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:51.418871    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:51.418881    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:51.418890    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:53.419422    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 9
	I0831 16:17:53.419436    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:53.419488    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:53.420343    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:53.420369    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:53.420381    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:53.420391    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:53.420397    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:53.420403    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:53.420411    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:53.420427    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:53.420436    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:53.420450    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:53.420463    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:53.420476    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:53.420490    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:53.420498    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:53.420510    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:53.420519    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:53.420526    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:53.420543    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:53.420553    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:55.422567    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 10
	I0831 16:17:55.422582    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:55.422638    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:55.423469    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:55.423518    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:55.423534    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:55.423570    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:55.423584    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:55.423592    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:55.423599    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:55.423609    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:55.423615    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:55.423622    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:55.423630    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:55.423645    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:55.423660    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:55.423678    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:55.423689    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:55.423700    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:55.423709    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:55.423717    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:55.423725    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:57.425738    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 11
	I0831 16:17:57.425754    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:57.425819    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:57.426667    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:57.426717    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:57.426729    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:57.426740    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:57.426748    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:57.426756    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:57.426763    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:57.426770    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:57.426777    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:57.426786    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:57.426793    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:57.426801    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:57.426808    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:57.426816    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:57.426830    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:57.426839    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:57.426851    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:57.426863    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:57.426886    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:17:59.428105    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 12
	I0831 16:17:59.428117    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:17:59.428179    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:17:59.429026    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:17:59.429078    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:17:59.429097    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:17:59.429115    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:17:59.429126    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:17:59.429136    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:17:59.429145    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:17:59.429153    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:17:59.429159    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:17:59.429166    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:17:59.429172    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:17:59.429178    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:17:59.429185    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:17:59.429193    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:17:59.429211    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:17:59.429223    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:17:59.429242    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:17:59.429251    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:17:59.429262    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:01.429996    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 13
	I0831 16:18:01.430009    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:01.430077    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:01.430907    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:01.430965    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:01.430976    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:01.430985    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:01.430991    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:01.431004    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:01.431034    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:01.431044    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:01.431053    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:01.431061    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:01.431067    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:01.431080    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:01.431093    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:01.431102    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:01.431110    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:01.431124    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:01.431139    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:01.431151    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:01.431162    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:03.433116    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 14
	I0831 16:18:03.433127    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:03.433201    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:03.434029    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:03.434055    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:03.434065    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:03.434075    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:03.434080    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:03.434110    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:03.434120    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:03.434127    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:03.434134    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:03.434147    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:03.434159    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:03.434167    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:03.434175    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:03.434184    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:03.434193    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:03.434211    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:03.434219    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:03.434227    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:03.434232    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:05.436194    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 15
	I0831 16:18:05.436223    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:05.436303    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:05.437152    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:05.437188    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:05.437199    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:05.437229    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:05.437240    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:05.437247    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:05.437257    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:05.437264    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:05.437273    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:05.437280    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:05.437286    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:05.437292    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:05.437302    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:05.437311    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:05.437320    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:05.437328    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:05.437341    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:05.437357    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:05.437367    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:07.438733    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 16
	I0831 16:18:07.438749    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:07.438845    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:07.439648    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:07.439719    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:07.439728    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:07.439736    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:07.439749    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:07.439756    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:07.439763    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:07.439789    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:07.439801    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:07.439810    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:07.439820    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:07.439832    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:07.439845    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:07.439865    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:07.439878    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:07.439899    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:07.439913    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:07.439921    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:07.439929    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:09.441020    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 17
	I0831 16:18:09.441033    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:09.441086    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:09.441869    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:09.441905    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:09.441915    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:09.441929    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:09.441951    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:09.441959    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:09.441966    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:09.441973    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:09.441980    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:09.441986    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:09.442002    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:09.442014    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:09.442025    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:09.442044    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:09.442058    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:09.442071    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:09.442079    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:09.442085    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:09.442092    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:11.443347    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 18
	I0831 16:18:11.443362    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:11.443410    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:11.444230    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:11.444257    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:11.444266    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:11.444277    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:11.444309    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:11.444320    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:11.444328    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:11.444343    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:11.444360    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:11.444374    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:11.444383    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:11.444391    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:11.444400    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:11.444408    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:11.444424    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:11.444437    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:11.444447    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:11.444460    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:11.444473    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:13.445552    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 19
	I0831 16:18:13.445566    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:13.445596    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:13.446409    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:13.446444    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:13.446452    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:13.446463    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:13.446470    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:13.446493    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:13.446505    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:13.446523    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:13.446536    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:13.446554    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:13.446562    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:13.446587    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:13.446619    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:13.446626    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:13.446633    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:13.446640    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:13.446648    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:13.446663    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:13.446677    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:15.448156    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 20
	I0831 16:18:15.448172    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:15.448283    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:15.449068    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:15.449116    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:15.449127    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:15.449139    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:15.449148    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:15.449155    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:15.449170    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:15.449178    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:15.449186    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:15.449197    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:15.449206    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:15.449221    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:15.449233    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:15.449242    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:15.449250    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:15.449259    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:15.449266    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:15.449272    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:15.449277    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:17.450811    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 21
	I0831 16:18:17.450823    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:17.450896    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:17.451661    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:17.451727    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:17.451744    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:17.451756    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:17.451766    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:17.451786    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:17.451799    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:17.451808    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:17.451816    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:17.451823    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:17.451831    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:17.451838    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:17.451848    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:17.451874    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:17.451886    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:17.451893    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:17.451905    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:17.451913    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:17.451921    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:19.453872    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 22
	I0831 16:18:19.453885    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:19.453948    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:19.454748    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:19.454795    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:19.454803    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:19.454821    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:19.454832    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:19.454840    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:19.454847    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:19.454853    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:19.454860    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:19.454866    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:19.454874    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:19.454886    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:19.454893    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:19.454899    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:19.454908    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:19.454916    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:19.454925    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:19.454932    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:19.454940    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:21.455116    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 23
	I0831 16:18:21.455127    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:21.455210    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:21.455991    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:21.456035    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:21.456049    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:21.456064    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:21.456078    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:21.456087    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:21.456095    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:21.456107    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:21.456118    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:21.456125    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:21.456140    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:21.456156    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:21.456169    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:21.456179    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:21.456187    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:21.456195    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:21.456204    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:21.456221    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:21.456237    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:23.457569    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 24
	I0831 16:18:23.457580    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:23.457639    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:23.458668    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:23.458740    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:23.458753    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:23.458763    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:23.458770    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:23.458777    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:23.458783    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:23.458798    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:23.458819    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:23.458826    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:23.458834    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:23.458841    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:23.458854    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:23.458862    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:23.458868    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:23.458888    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:23.458902    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:23.458910    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:23.458919    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:25.460923    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 25
	I0831 16:18:25.460934    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:25.461016    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:25.461828    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:25.461870    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:25.461891    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:25.461903    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:25.461911    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:25.461919    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:25.461926    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:25.461950    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:25.461963    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:25.461972    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:25.461979    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:25.461987    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:25.461994    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:25.462011    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:25.462023    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:25.462033    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:25.462042    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:25.462049    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:25.462057    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:27.464098    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 26
	I0831 16:18:27.464108    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:27.464195    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:27.464953    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:27.464992    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:27.465015    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:27.465025    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:27.465032    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:27.465039    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:27.465045    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:27.465052    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:27.465059    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:27.465065    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:27.465071    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:27.465080    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:27.465088    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:27.465101    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:27.465109    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:27.465118    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:27.465126    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:27.465133    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:27.465141    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:29.466114    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 27
	I0831 16:18:29.466127    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:29.466179    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:29.466949    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:29.467004    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:29.467014    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:29.467029    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:29.467036    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:29.467065    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:29.467076    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:29.467084    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:29.467091    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:29.467097    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:29.467108    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:29.467119    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:29.467127    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:29.467135    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:29.467142    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:29.467149    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:29.467155    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:29.467169    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:29.467188    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:31.469235    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 28
	I0831 16:18:31.469255    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:31.469338    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:31.470134    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:31.470186    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:31.470202    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:31.470213    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:31.470221    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:31.470230    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:31.470249    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:31.470267    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:31.470279    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:31.470288    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:31.470296    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:31.470304    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:31.470321    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:31.470334    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:31.470343    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:31.470353    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:31.470362    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:31.470369    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:31.470377    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:33.471094    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Attempt 29
	I0831 16:18:33.471106    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:18:33.471203    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | hyperkit pid from json: 6149
	I0831 16:18:33.471990    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Searching for ea:cd:5:24:60:3 in /var/db/dhcpd_leases ...
	I0831 16:18:33.472032    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0831 16:18:33.472045    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:ce:1c:1e:40:9:54 ID:1,ce:1c:1e:40:9:54 Lease:0x66d4f4f0}
	I0831 16:18:33.472060    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:76:9b:10:4b:2f:9c ID:1,76:9b:10:4b:2f:9c Lease:0x66d4f466}
	I0831 16:18:33.472069    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:ce:70:4d:b2:3c:b ID:1,ce:70:4d:b2:3c:b Lease:0x66d4f3d1}
	I0831 16:18:33.472083    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a280}
	I0831 16:18:33.472097    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:18:33.472115    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:18:33.472127    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:86:1d:5b:46:48:c6 ID:1,86:1d:5b:46:48:c6 Lease:0x66d4f05f}
	I0831 16:18:33.472136    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:3e:1c:67:cb:cf:85 ID:1,3e:1c:67:cb:cf:85 Lease:0x66d4f009}
	I0831 16:18:33.472142    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:82:ca:b:26:7a:a9 ID:1,82:ca:b:26:7a:a9 Lease:0x66d4efdc}
	I0831 16:18:33.472149    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:aa:71:9c:2d:6f:19 ID:1,aa:71:9c:2d:6f:19 Lease:0x66d4ef73}
	I0831 16:18:33.472158    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39e49}
	I0831 16:18:33.472170    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 16:18:33.472178    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 16:18:33.472187    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 16:18:33.472195    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 16:18:33.472210    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 16:18:33.472223    6072 main.go:141] libmachine: (force-systemd-env-257000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 16:18:35.474383    6072 client.go:171] duration metric: took 1m0.949429718s to LocalClient.Create
	I0831 16:18:37.476549    6072 start.go:128] duration metric: took 1m3.004584134s to createHost
	I0831 16:18:37.476561    6072 start.go:83] releasing machines lock for "force-systemd-env-257000", held for 1m3.004701512s
	W0831 16:18:37.476621    6072 out.go:270] * Failed to start hyperkit VM. Running "minikube delete -p force-systemd-env-257000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ea:cd:5:24:60:3
	* Failed to start hyperkit VM. Running "minikube delete -p force-systemd-env-257000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ea:cd:5:24:60:3
	I0831 16:18:37.539866    6072 out.go:201] 
	W0831 16:18:37.562955    6072 out.go:270] X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ea:cd:5:24:60:3
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ea:cd:5:24:60:3
	W0831 16:18:37.562969    6072 out.go:270] * 
	* 
	W0831 16:18:37.563712    6072 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 16:18:37.625914    6072 out.go:201] 

                                                
                                                
** /stderr **
docker_test.go:157: failed to start minikube with args: "out/minikube-darwin-amd64 start -p force-systemd-env-257000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit " : exit status 80
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-env-257000 ssh "docker info --format {{.CgroupDriver}}"
docker_test.go:110: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p force-systemd-env-257000 ssh "docker info --format {{.CgroupDriver}}": exit status 50 (176.758068ms)

                                                
                                                
-- stdout --
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node force-systemd-env-257000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

                                                
                                                
** /stderr **
docker_test.go:112: failed to get docker cgroup driver. args "out/minikube-darwin-amd64 -p force-systemd-env-257000 ssh \"docker info --format {{.CgroupDriver}}\"": exit status 50
docker_test.go:166: *** TestForceSystemdEnv FAILED at 2024-08-31 16:18:37.927024 -0700 PDT m=+4415.123885184
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p force-systemd-env-257000 -n force-systemd-env-257000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p force-systemd-env-257000 -n force-systemd-env-257000: exit status 7 (79.890213ms)

                                                
                                                
-- stdout --
	Error

                                                
                                                
-- /stdout --
** stderr ** 
	E0831 16:18:38.004937    6175 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0831 16:18:38.004960    6175 status.go:249] status error: getting IP: IP address is not set

                                                
                                                
** /stderr **
helpers_test.go:240: status error: exit status 7 (may be ok)
helpers_test.go:242: "force-systemd-env-257000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:176: Cleaning up "force-systemd-env-257000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-env-257000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-env-257000: (5.262779834s)
--- FAIL: TestForceSystemdEnv (234.76s)

                                                
                                    
TestMultiControlPlane/serial/AddWorkerNode (79.65s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 node add -p ha-949000 -v=7 --alsologtostderr
E0831 15:32:52.717411    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:32:52.723842    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:32:52.735039    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:32:52.757438    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:32:52.799976    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:32:52.881416    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:32:53.044322    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:32:53.367480    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:32:54.009436    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:32:55.292981    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:32:57.855082    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:33:02.976561    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:33:13.219187    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:33:33.701257    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
ha_test.go:228: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p ha-949000 -v=7 --alsologtostderr: exit status 90 (1m16.553607113s)

                                                
                                                
-- stdout --
	* Adding node m04 to cluster ha-949000 as [worker]
	* Starting "ha-949000-m04" worker node in "ha-949000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:32:27.402648    3366 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:32:27.403032    3366 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:32:27.403038    3366 out.go:358] Setting ErrFile to fd 2...
	I0831 15:32:27.403042    3366 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:32:27.403205    3366 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:32:27.403523    3366 mustload.go:65] Loading cluster: ha-949000
	I0831 15:32:27.403850    3366 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:32:27.404243    3366 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:32:27.404290    3366 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:32:27.412476    3366 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51220
	I0831 15:32:27.412869    3366 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:32:27.413365    3366 main.go:141] libmachine: Using API Version  1
	I0831 15:32:27.413374    3366 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:32:27.413616    3366 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:32:27.413752    3366 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:32:27.413837    3366 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:27.413901    3366 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:32:27.414893    3366 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:32:27.415156    3366 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:32:27.415178    3366 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:32:27.423564    3366 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51222
	I0831 15:32:27.423908    3366 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:32:27.424274    3366 main.go:141] libmachine: Using API Version  1
	I0831 15:32:27.424286    3366 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:32:27.424514    3366 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:32:27.424625    3366 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:32:27.424954    3366 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:32:27.424976    3366 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:32:27.433750    3366 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51224
	I0831 15:32:27.434195    3366 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:32:27.434632    3366 main.go:141] libmachine: Using API Version  1
	I0831 15:32:27.434657    3366 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:32:27.434880    3366 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:32:27.435026    3366 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:32:27.435147    3366 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:27.435263    3366 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:32:27.436297    3366 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:32:27.436586    3366 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:32:27.436608    3366 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:32:27.445220    3366 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51226
	I0831 15:32:27.445540    3366 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:32:27.445870    3366 main.go:141] libmachine: Using API Version  1
	I0831 15:32:27.445897    3366 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:32:27.446133    3366 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:32:27.446253    3366 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:32:27.446610    3366 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:32:27.446640    3366 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:32:27.454986    3366 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51228
	I0831 15:32:27.455326    3366 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:32:27.455694    3366 main.go:141] libmachine: Using API Version  1
	I0831 15:32:27.455711    3366 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:32:27.455932    3366 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:32:27.456051    3366 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:32:27.456132    3366 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:27.456212    3366 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:32:27.457175    3366 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:32:27.457441    3366 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:32:27.457466    3366 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:32:27.465984    3366 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51230
	I0831 15:32:27.466352    3366 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:32:27.466701    3366 main.go:141] libmachine: Using API Version  1
	I0831 15:32:27.466721    3366 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:32:27.466915    3366 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:32:27.467011    3366 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:32:27.467115    3366 api_server.go:166] Checking apiserver status ...
	I0831 15:32:27.467172    3366 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:32:27.467191    3366 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:32:27.467302    3366 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:32:27.467389    3366 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:32:27.467468    3366 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:32:27.467554    3366 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:32:27.511232    3366 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:32:27.519057    3366 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:32:27.519118    3366 ssh_runner.go:195] Run: ls
	I0831 15:32:27.522349    3366 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:32:27.526926    3366 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:32:27.549056    3366 out.go:177] * Adding node m04 to cluster ha-949000 as [worker]
	I0831 15:32:27.570781    3366 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:32:27.570957    3366 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:32:27.609413    3366 out.go:177] * Starting "ha-949000-m04" worker node in "ha-949000" cluster
	I0831 15:32:27.647330    3366 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:32:27.647382    3366 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:32:27.647403    3366 cache.go:56] Caching tarball of preloaded images
	I0831 15:32:27.647564    3366 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:32:27.647579    3366 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:32:27.647672    3366 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:32:27.648237    3366 start.go:360] acquireMachinesLock for ha-949000-m04: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:32:27.648318    3366 start.go:364] duration metric: took 61.761µs to acquireMachinesLock for "ha-949000-m04"
	I0831 15:32:27.648342    3366 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP: Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m04 IP: Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}
	I0831 15:32:27.648503    3366 start.go:125] createHost starting for "m04" (driver="hyperkit")
	I0831 15:32:27.669338    3366 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:32:27.669568    3366 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:32:27.669596    3366 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:32:27.678356    3366 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51234
	I0831 15:32:27.678706    3366 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:32:27.679072    3366 main.go:141] libmachine: Using API Version  1
	I0831 15:32:27.679097    3366 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:32:27.679296    3366 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:32:27.679402    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:32:27.679483    3366 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:32:27.679569    3366 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:32:27.679593    3366 client.go:168] LocalClient.Create starting
	I0831 15:32:27.679622    3366 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:32:27.679677    3366 main.go:141] libmachine: Decoding PEM data...
	I0831 15:32:27.679695    3366 main.go:141] libmachine: Parsing certificate...
	I0831 15:32:27.679755    3366 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:32:27.679796    3366 main.go:141] libmachine: Decoding PEM data...
	I0831 15:32:27.679808    3366 main.go:141] libmachine: Parsing certificate...
	I0831 15:32:27.679823    3366 main.go:141] libmachine: Running pre-create checks...
	I0831 15:32:27.679828    3366 main.go:141] libmachine: (ha-949000-m04) Calling .PreCreateCheck
	I0831 15:32:27.679954    3366 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:27.680011    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetConfigRaw
	I0831 15:32:27.680447    3366 main.go:141] libmachine: Creating machine...
	I0831 15:32:27.680455    3366 main.go:141] libmachine: (ha-949000-m04) Calling .Create
	I0831 15:32:27.680526    3366 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:27.680649    3366 main.go:141] libmachine: (ha-949000-m04) DBG | I0831 15:32:27.680527    3376 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:32:27.680726    3366 main.go:141] libmachine: (ha-949000-m04) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:32:27.963211    3366 main.go:141] libmachine: (ha-949000-m04) DBG | I0831 15:32:27.963143    3376 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa...
	I0831 15:32:28.138404    3366 main.go:141] libmachine: (ha-949000-m04) DBG | I0831 15:32:28.138339    3376 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk...
	I0831 15:32:28.138421    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Writing magic tar header
	I0831 15:32:28.138432    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Writing SSH key tar header
	I0831 15:32:28.139221    3366 main.go:141] libmachine: (ha-949000-m04) DBG | I0831 15:32:28.139121    3376 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04 ...
	I0831 15:32:28.616102    3366 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:28.616124    3366 main.go:141] libmachine: (ha-949000-m04) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid
	I0831 15:32:28.616170    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Using UUID 5ee34770-2239-4427-9789-bd204fe095a6
	I0831 15:32:28.643097    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Generated MAC 8a:3c:61:5f:c5:84
	I0831 15:32:28.643119    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:32:28.643168    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00019a630)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:32:28.643197    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00019a630)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:32:28.643246    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "5ee34770-2239-4427-9789-bd204fe095a6", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:32:28.643295    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 5ee34770-2239-4427-9789-bd204fe095a6 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:32:28.643310    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:32:28.646360    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 DEBUG: hyperkit: Pid is 3377
	I0831 15:32:28.646896    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 0
	I0831 15:32:28.646910    3366 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:28.646988    3366 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:32:28.647914    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:32:28.647999    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0831 15:32:28.648026    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:32:28.648049    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:32:28.648065    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:32:28.648081    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:32:28.648106    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:32:28.648124    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:32:28.654387    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:32:28.662691    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:32:28.663662    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:32:28.663690    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:32:28.663699    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:32:28.663705    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:32:29.051437    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:32:29.051452    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:32:29.166105    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:32:29.166124    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:32:29.166151    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:32:29.166169    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:32:29.166917    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:32:29.166928    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:32:30.649656    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 1
	I0831 15:32:30.649673    3366 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:30.649770    3366 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:32:30.650610    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:32:30.650652    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0831 15:32:30.650664    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:32:30.650672    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:32:30.650679    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:32:30.650691    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:32:30.650699    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:32:30.650717    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:32:32.651331    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 2
	I0831 15:32:32.651348    3366 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:32.651457    3366 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:32:32.652360    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:32:32.652409    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0831 15:32:32.652422    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:32:32.652430    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:32:32.652445    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:32:32.652452    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:32:32.652475    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:32:32.652487    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:32:34.652766    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 3
	I0831 15:32:34.652783    3366 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:34.652863    3366 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:32:34.653661    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:32:34.653687    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0831 15:32:34.653694    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:32:34.653703    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:32:34.653713    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:32:34.653720    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:32:34.653728    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:32:34.653736    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:32:34.783242    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:34 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 15:32:34.783336    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:34 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 15:32:34.783368    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:34 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 15:32:34.808079    3366 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:32:34 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 15:32:36.655248    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 4
	I0831 15:32:36.655281    3366 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:36.655359    3366 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:32:36.656148    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:32:36.656193    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0831 15:32:36.656208    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:32:36.656219    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:32:36.656225    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:32:36.656231    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:32:36.656238    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:32:36.656245    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:32:38.657418    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 5
	I0831 15:32:38.657436    3366 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:38.657547    3366 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:32:38.658313    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:32:38.658385    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:32:38.658394    3366 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d4eb85}
	I0831 15:32:38.658403    3366 main.go:141] libmachine: (ha-949000-m04) DBG | Found match: 8a:3c:61:5f:c5:84
	I0831 15:32:38.658408    3366 main.go:141] libmachine: (ha-949000-m04) DBG | IP: 192.169.0.8
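	[Annotation] The retry loop above discovers the new VM's IP by repeatedly scanning macOS's DHCP lease file for the MAC hyperkit generated (8a:3c:61:5f:c5:84). A minimal Go sketch of that lookup, assuming the usual `/var/db/dhcpd_leases` block format with `ip_address=` and `hw_address=1,<mac>` fields (this is an illustration, not minikube's actual parser):

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// findIPByMAC scans dhcpd_leases-style content for the lease block whose
// hw_address matches mac and returns its ip_address.
// Illustrative sketch only; field names assume the macOS lease file layout.
func findIPByMAC(content, mac string) (string, bool) {
	var ip, hw string
	sc := bufio.NewScanner(strings.NewReader(content))
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		switch {
		case line == "{": // start of a new lease block
			ip, hw = "", ""
		case strings.HasPrefix(line, "ip_address="):
			ip = strings.TrimPrefix(line, "ip_address=")
		case strings.HasPrefix(line, "hw_address="):
			// format: hw_address=1,8a:3c:61:5f:c5:84 -> keep the part after the comma
			if i := strings.Index(line, ","); i >= 0 {
				hw = line[i+1:]
			}
		case line == "}": // end of block: check for a match
			if hw == mac {
				return ip, true
			}
		}
	}
	return "", false
}

func main() {
	leases := "{\n\tname=minikube\n\tip_address=192.169.0.8\n\thw_address=1,8a:3c:61:5f:c5:84\n\tlease=0x66d4eb85\n}\n"
	ip, ok := findIPByMAC(leases, "8a:3c:61:5f:c5:84")
	fmt.Println(ip, ok) // 192.169.0.8 true
}
```

	The driver polls every ~2 seconds ("Attempt 0" … "Attempt 5") until the lease count grows and the generated MAC appears.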
	I0831 15:32:38.658441    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetConfigRaw
	I0831 15:32:38.659038    3366 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:32:38.659145    3366 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:32:38.659243    3366 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:32:38.659255    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:32:38.659364    3366 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:32:38.659436    3366 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:32:38.660228    3366 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:32:38.660239    3366 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:32:38.660245    3366 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:32:38.660249    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:38.660342    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:38.660434    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:38.660526    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:38.660619    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:38.660741    3366 main.go:141] libmachine: Using SSH client type: native
	I0831 15:32:38.660954    3366 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x6b52ea0] 0x6b55c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:32:38.660962    3366 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:32:39.723949    3366 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:32:39.723964    3366 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:32:39.723969    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:39.724099    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:39.724212    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:39.724310    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:39.724399    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:39.724533    3366 main.go:141] libmachine: Using SSH client type: native
	I0831 15:32:39.724676    3366 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x6b52ea0] 0x6b55c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:32:39.724683    3366 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:32:39.795378    3366 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:32:39.795431    3366 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:32:39.795437    3366 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:32:39.795443    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:32:39.795577    3366 buildroot.go:166] provisioning hostname "ha-949000-m04"
	I0831 15:32:39.795588    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:32:39.795691    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:39.795784    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:39.795862    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:39.795950    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:39.796041    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:39.796157    3366 main.go:141] libmachine: Using SSH client type: native
	I0831 15:32:39.796305    3366 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x6b52ea0] 0x6b55c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:32:39.796314    3366 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m04 && echo "ha-949000-m04" | sudo tee /etc/hostname
	I0831 15:32:39.870179    3366 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m04
	
	I0831 15:32:39.870196    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:39.870327    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:39.870421    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:39.870516    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:39.870583    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:39.870685    3366 main.go:141] libmachine: Using SSH client type: native
	I0831 15:32:39.870863    3366 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x6b52ea0] 0x6b55c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:32:39.870876    3366 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:32:39.942005    3366 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:32:39.942027    3366 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:32:39.942044    3366 buildroot.go:174] setting up certificates
	I0831 15:32:39.942058    3366 provision.go:84] configureAuth start
	I0831 15:32:39.942066    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:32:39.942196    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:32:39.942289    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:39.942384    3366 provision.go:143] copyHostCerts
	I0831 15:32:39.942414    3366 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:32:39.942467    3366 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:32:39.942474    3366 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:32:39.942606    3366 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:32:39.942807    3366 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:32:39.942837    3366 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:32:39.942842    3366 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:32:39.942918    3366 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:32:39.943056    3366 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:32:39.943094    3366 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:32:39.943099    3366 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:32:39.943170    3366 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:32:39.943304    3366 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m04 san=[127.0.0.1 192.169.0.8 ha-949000-m04 localhost minikube]
	I0831 15:32:40.121924    3366 provision.go:177] copyRemoteCerts
	I0831 15:32:40.121979    3366 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:32:40.121994    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:40.122137    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:40.122242    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:40.122329    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:40.122426    3366 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:32:40.161890    3366 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:32:40.161963    3366 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:32:40.181502    3366 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:32:40.181592    3366 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:32:40.201926    3366 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:32:40.202000    3366 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:32:40.221484    3366 provision.go:87] duration metric: took 279.413535ms to configureAuth
	I0831 15:32:40.221498    3366 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:32:40.221674    3366 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:32:40.221688    3366 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:32:40.221821    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:40.221911    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:40.221987    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:40.222066    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:40.222140    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:40.222234    3366 main.go:141] libmachine: Using SSH client type: native
	I0831 15:32:40.222361    3366 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x6b52ea0] 0x6b55c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:32:40.222368    3366 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:32:40.286302    3366 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:32:40.286315    3366 buildroot.go:70] root file system type: tmpfs
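The probe above can be replayed locally: minikube asks the guest for the filesystem type backing `/` and records it (here `tmpfs`, since the buildroot guest runs from RAM). This is a minimal sketch of that one command; it assumes GNU `df`, which provides the `--output` flag.

```shell
# Same probe minikube ran over SSH above: print only the fstype column
# for the root mount, then keep the last line (the value itself).
# Requires GNU coreutils df for --output=fstype.
df --output=fstype / | tail -n 1
```

On the buildroot guest this prints `tmpfs`; on an ordinary Linux host it typically prints `ext4`, `btrfs`, or similar.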
	I0831 15:32:40.286396    3366 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:32:40.286408    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:40.286550    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:40.286661    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:40.286767    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:40.286855    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:40.287015    3366 main.go:141] libmachine: Using SSH client type: native
	I0831 15:32:40.287163    3366 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x6b52ea0] 0x6b55c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:32:40.287208    3366 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:32:40.362291    3366 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:32:40.362326    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:40.362476    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:40.362575    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:40.362664    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:40.362751    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:40.362895    3366 main.go:141] libmachine: Using SSH client type: native
	I0831 15:32:40.363033    3366 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x6b52ea0] 0x6b55c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:32:40.363045    3366 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
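The command above uses a write-then-compare idiom: the candidate unit is written to a `.new` file first, and only if `diff` reports a difference (or the installed file is missing, as in the output that follows) is the file swapped in and the service reloaded. A minimal sketch of that pattern, using a temporary sandbox directory rather than the real `/lib/systemd/system` paths and omitting the `systemctl` steps:

```shell
# Sketch of the change-detection idiom from the SSH command above:
# only replace the installed file when the ".new" candidate differs.
# (Sandbox paths; the real command also runs daemon-reload/restart.)
dir=$(mktemp -d)
printf '%s\n' 'old contents' > "$dir/docker.service"
printf '%s\n' 'new contents' > "$dir/docker.service.new"
diff -u "$dir/docker.service" "$dir/docker.service.new" >/dev/null \
  || { mv "$dir/docker.service.new" "$dir/docker.service"; echo "unit updated"; }
cat "$dir/docker.service"
```

Because `diff` exits non-zero when the files differ, the `||` branch performs the swap; when the files already match, nothing is touched and no reload is needed.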
	I0831 15:32:41.879277    3366 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:32:41.879293    3366 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:32:41.879299    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetURL
	I0831 15:32:41.879433    3366 main.go:141] libmachine: Docker is up and running!
	I0831 15:32:41.879442    3366 main.go:141] libmachine: Reticulating splines...
	I0831 15:32:41.879447    3366 client.go:171] duration metric: took 14.19970268s to LocalClient.Create
	I0831 15:32:41.879461    3366 start.go:167] duration metric: took 14.199746618s to libmachine.API.Create "ha-949000"
	I0831 15:32:41.879471    3366 start.go:293] postStartSetup for "ha-949000-m04" (driver="hyperkit")
	I0831 15:32:41.879480    3366 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:32:41.879493    3366 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:32:41.879663    3366 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:32:41.879675    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:41.879755    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:41.879851    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:41.879944    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:41.880025    3366 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:32:41.923123    3366 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:32:41.927529    3366 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:32:41.927548    3366 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:32:41.927665    3366 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:32:41.927853    3366 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:32:41.927860    3366 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:32:41.928086    3366 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:32:41.940937    3366 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:32:41.965524    3366 start.go:296] duration metric: took 86.032735ms for postStartSetup
	I0831 15:32:41.965556    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetConfigRaw
	I0831 15:32:41.966169    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:32:41.966341    3366 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:32:41.966698    3366 start.go:128] duration metric: took 14.318037547s to createHost
	I0831 15:32:41.966715    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:41.966800    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:41.966890    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:41.966986    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:41.967061    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:41.967165    3366 main.go:141] libmachine: Using SSH client type: native
	I0831 15:32:41.967297    3366 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x6b52ea0] 0x6b55c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:32:41.967304    3366 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:32:42.032132    3366 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143561.998776907
	
	I0831 15:32:42.032143    3366 fix.go:216] guest clock: 1725143561.998776907
	I0831 15:32:42.032149    3366 fix.go:229] Guest: 2024-08-31 15:32:41.998776907 -0700 PDT Remote: 2024-08-31 15:32:41.966709 -0700 PDT m=+14.599772738 (delta=32.067907ms)
	I0831 15:32:42.032170    3366 fix.go:200] guest clock delta is within tolerance: 32.067907ms
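The guest-clock check above samples `date +%s.%N` on the VM over SSH and compares it to the host's wall clock, accepting the result when the absolute delta is under a tolerance. A rough shell sketch of that comparison, with both samples taken locally for illustration (in minikube the first sample comes from the guest) and a hypothetical 1-second tolerance standing in for whatever fix.go actually uses:

```shell
# Illustrative version of the guest-clock delta check logged above.
# "guest" is sampled locally here; minikube samples it over SSH.
guest=$(date +%s.%N)
host=$(date +%s.%N)
delta=$(awk -v g="$guest" -v h="$host" 'BEGIN { d = h - g; if (d < 0) d = -d; print d }')
# Accept if |delta| is under a (hypothetical) 1-second tolerance.
awk -v d="$delta" 'BEGIN { exit !(d < 1.0) }' \
  && echo "guest clock delta is within tolerance: ${delta}s"
```

Shell arithmetic cannot handle the fractional seconds from `%N`, hence the `awk` helpers for the subtraction and the threshold test.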
	I0831 15:32:42.032174    3366 start.go:83] releasing machines lock for "ha-949000-m04", held for 14.383698429s
	I0831 15:32:42.032195    3366 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:32:42.032325    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:32:42.032430    3366 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:32:42.032736    3366 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:32:42.032841    3366 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:32:42.032940    3366 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:32:42.032994    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:42.032999    3366 ssh_runner.go:195] Run: systemctl --version
	I0831 15:32:42.033008    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:32:42.033115    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:42.033127    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:32:42.033208    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:42.033234    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:32:42.033317    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:42.033332    3366 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:32:42.033424    3366 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:32:42.033440    3366 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:32:42.111032    3366 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:32:42.116037    3366 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:32:42.116080    3366 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:32:42.128247    3366 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:32:42.128263    3366 start.go:495] detecting cgroup driver to use...
	I0831 15:32:42.128376    3366 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:32:42.143946    3366 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:32:42.152474    3366 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:32:42.160955    3366 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:32:42.161014    3366 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:32:42.169309    3366 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:32:42.178014    3366 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:32:42.186670    3366 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:32:42.194918    3366 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:32:42.203498    3366 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:32:42.211762    3366 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:32:42.219963    3366 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:32:42.228521    3366 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:32:42.235938    3366 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:32:42.243421    3366 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:32:42.336408    3366 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:32:42.355668    3366 start.go:495] detecting cgroup driver to use...
	I0831 15:32:42.355761    3366 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:32:42.371833    3366 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:32:42.384339    3366 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:32:42.400480    3366 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:32:42.412071    3366 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:32:42.423689    3366 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:32:42.449611    3366 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:32:42.460124    3366 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:32:42.475396    3366 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:32:42.478284    3366 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:32:42.485697    3366 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:32:42.499251    3366 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:32:42.597597    3366 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:32:42.703914    3366 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:32:42.704003    3366 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:32:42.719503    3366 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:32:42.837595    3366 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:33:43.798516    3366 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1m0.960253412s)
	I0831 15:33:43.798596    3366 ssh_runner.go:195] Run: sudo journalctl --no-pager -u docker
	I0831 15:33:43.833858    3366 out.go:201] 
	W0831 15:33:43.854425    3366 out.go:270] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Aug 31 22:32:40 ha-949000-m04 systemd[1]: Starting Docker Application Container Engine...
	Aug 31 22:32:40 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:40.611593570Z" level=info msg="Starting up"
	Aug 31 22:32:40 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:40.612034104Z" level=info msg="containerd not running, starting managed containerd"
	Aug 31 22:32:40 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:40.612614156Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=516
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.627739551Z" level=info msg="starting containerd" revision=472731909fa34bd7bc9c087e4c27943f9835f111 version=v1.7.21
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.642887159Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.642981210Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643048375Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643090698Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643172119Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643208440Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643359200Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643399276Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643461777Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643500192Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643579219Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643758547Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645385834Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645439895Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645579646Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645624246Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645717591Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645785391Z" level=info msg="metadata content store policy set" policy=shared
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648075342Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648172725Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648224602Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648258935Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648290722Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648384550Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648622534Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648727838Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648765775Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648797087Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648830673Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648863418Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648895157Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648929677Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648960987Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648993049Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649026203Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649056023Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649098607Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649134451Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649167957Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649199477Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649231471Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649261946Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649298665Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649331713Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649362778Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649393763Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649422712Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649451947Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649481967Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649545391Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649593337Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649626091Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649656082Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649729350Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649777417Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649808316Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649837726Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649866129Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649895071Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649923246Z" level=info msg="NRI interface is disabled by configuration."
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.650101800Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.650189739Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.650249105Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.650317954Z" level=info msg="containerd successfully booted in 0.023352s"
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.639163786Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.643282460Z" level=info msg="Loading containers: start."
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.724581287Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.806710739Z" level=info msg="Loading containers: done."
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.815043773Z" level=info msg="Docker daemon" commit=3ab5c7d0 containerd-snapshotter=false storage-driver=overlay2 version=27.2.0
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.815132617Z" level=info msg="Daemon has completed initialization"
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.841569438Z" level=info msg="API listen on /var/run/docker.sock"
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.841748961Z" level=info msg="API listen on [::]:2376"
	Aug 31 22:32:41 ha-949000-m04 systemd[1]: Started Docker Application Container Engine.
	Aug 31 22:32:42 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:42.827442547Z" level=info msg="Processing signal 'terminated'"
	Aug 31 22:32:42 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:42.828167343Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	Aug 31 22:32:42 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:42.828650151Z" level=info msg="Daemon shutdown complete"
	Aug 31 22:32:42 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:42.828743080Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Aug 31 22:32:42 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:42.828761097Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Aug 31 22:32:42 ha-949000-m04 systemd[1]: Stopping Docker Application Container Engine...
	Aug 31 22:32:43 ha-949000-m04 systemd[1]: docker.service: Deactivated successfully.
	Aug 31 22:32:43 ha-949000-m04 systemd[1]: Stopped Docker Application Container Engine.
	Aug 31 22:32:43 ha-949000-m04 systemd[1]: Starting Docker Application Container Engine...
	Aug 31 22:32:43 ha-949000-m04 dockerd[905]: time="2024-08-31T22:32:43.868231097Z" level=info msg="Starting up"
	Aug 31 22:33:43 ha-949000-m04 dockerd[905]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Aug 31 22:33:43 ha-949000-m04 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Aug 31 22:33:43 ha-949000-m04 systemd[1]: docker.service: Failed with result 'exit-code'.
	Aug 31 22:33:43 ha-949000-m04 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Aug 31 22:32:40 ha-949000-m04 systemd[1]: Starting Docker Application Container Engine...
	Aug 31 22:32:40 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:40.611593570Z" level=info msg="Starting up"
	Aug 31 22:32:40 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:40.612034104Z" level=info msg="containerd not running, starting managed containerd"
	Aug 31 22:32:40 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:40.612614156Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=516
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.627739551Z" level=info msg="starting containerd" revision=472731909fa34bd7bc9c087e4c27943f9835f111 version=v1.7.21
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.642887159Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.642981210Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643048375Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643090698Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643172119Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643208440Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643359200Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643399276Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643461777Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643500192Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643579219Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.643758547Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645385834Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645439895Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645579646Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645624246Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645717591Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.645785391Z" level=info msg="metadata content store policy set" policy=shared
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648075342Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648172725Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648224602Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648258935Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648290722Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648384550Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648622534Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648727838Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648765775Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648797087Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648830673Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648863418Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648895157Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648929677Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648960987Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.648993049Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649026203Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649056023Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649098607Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649134451Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649167957Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649199477Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649231471Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649261946Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649298665Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649331713Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649362778Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649393763Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649422712Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649451947Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649481967Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649545391Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649593337Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649626091Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649656082Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649729350Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649777417Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649808316Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649837726Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649866129Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649895071Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.649923246Z" level=info msg="NRI interface is disabled by configuration."
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.650101800Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.650189739Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.650249105Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Aug 31 22:32:40 ha-949000-m04 dockerd[516]: time="2024-08-31T22:32:40.650317954Z" level=info msg="containerd successfully booted in 0.023352s"
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.639163786Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.643282460Z" level=info msg="Loading containers: start."
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.724581287Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.806710739Z" level=info msg="Loading containers: done."
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.815043773Z" level=info msg="Docker daemon" commit=3ab5c7d0 containerd-snapshotter=false storage-driver=overlay2 version=27.2.0
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.815132617Z" level=info msg="Daemon has completed initialization"
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.841569438Z" level=info msg="API listen on /var/run/docker.sock"
	Aug 31 22:32:41 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:41.841748961Z" level=info msg="API listen on [::]:2376"
	Aug 31 22:32:41 ha-949000-m04 systemd[1]: Started Docker Application Container Engine.
	Aug 31 22:32:42 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:42.827442547Z" level=info msg="Processing signal 'terminated'"
	Aug 31 22:32:42 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:42.828167343Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	Aug 31 22:32:42 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:42.828650151Z" level=info msg="Daemon shutdown complete"
	Aug 31 22:32:42 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:42.828743080Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Aug 31 22:32:42 ha-949000-m04 dockerd[509]: time="2024-08-31T22:32:42.828761097Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Aug 31 22:32:42 ha-949000-m04 systemd[1]: Stopping Docker Application Container Engine...
	Aug 31 22:32:43 ha-949000-m04 systemd[1]: docker.service: Deactivated successfully.
	Aug 31 22:32:43 ha-949000-m04 systemd[1]: Stopped Docker Application Container Engine.
	Aug 31 22:32:43 ha-949000-m04 systemd[1]: Starting Docker Application Container Engine...
	Aug 31 22:32:43 ha-949000-m04 dockerd[905]: time="2024-08-31T22:32:43.868231097Z" level=info msg="Starting up"
	Aug 31 22:33:43 ha-949000-m04 dockerd[905]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Aug 31 22:33:43 ha-949000-m04 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Aug 31 22:33:43 ha-949000-m04 systemd[1]: docker.service: Failed with result 'exit-code'.
	Aug 31 22:33:43 ha-949000-m04 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	W0831 15:33:43.854470    3366 out.go:270] * 
	* 
	W0831 15:33:43.857008    3366 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 15:33:43.878647    3366 out.go:201] 

                                                
                                                
** /stderr **
ha_test.go:230: failed to add worker node to current ha (multi-control plane) cluster. args "out/minikube-darwin-amd64 node add -p ha-949000 -v=7 --alsologtostderr" : exit status 90
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-949000 -n ha-949000
helpers_test.go:245: <<< TestMultiControlPlane/serial/AddWorkerNode FAILED: start of post-mortem logs <<<
helpers_test.go:246: ======>  post-mortem[TestMultiControlPlane/serial/AddWorkerNode]: minikube logs <======
helpers_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 logs -n 25
helpers_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 logs -n 25: (2.461076409s)
helpers_test.go:253: TestMultiControlPlane/serial/AddWorkerNode logs: 
-- stdout --
	
	==> Audit <==
	|----------------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	|    Command     |                 Args                 |      Profile      |  User   | Version |     Start Time      |      End Time       |
	|----------------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| update-context | functional-593000                    | functional-593000 | jenkins | v1.33.1 | 31 Aug 24 15:28 PDT | 31 Aug 24 15:28 PDT |
	|                | update-context                       |                   |         |         |                     |                     |
	|                | --alsologtostderr -v=2               |                   |         |         |                     |                     |
	| update-context | functional-593000                    | functional-593000 | jenkins | v1.33.1 | 31 Aug 24 15:28 PDT | 31 Aug 24 15:28 PDT |
	|                | update-context                       |                   |         |         |                     |                     |
	|                | --alsologtostderr -v=2               |                   |         |         |                     |                     |
	| delete         | -p functional-593000                 | functional-593000 | jenkins | v1.33.1 | 31 Aug 24 15:29 PDT | 31 Aug 24 15:29 PDT |
	| start          | -p ha-949000 --wait=true             | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:29 PDT | 31 Aug 24 15:32 PDT |
	|                | --memory=2200 --ha                   |                   |         |         |                     |                     |
	|                | -v=7 --alsologtostderr               |                   |         |         |                     |                     |
	|                | --driver=hyperkit                    |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- apply -f             | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | ./testdata/ha/ha-pod-dns-test.yaml   |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- rollout status       | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | deployment/busybox                   |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw -- nslookup  |                   |         |         |                     |                     |
	|                | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 -- nslookup  |                   |         |         |                     |                     |
	|                | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x -- nslookup  |                   |         |         |                     |                     |
	|                | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw              |                   |         |         |                     |                     |
	|                | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|                | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|                | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw -- sh        |                   |         |         |                     |                     |
	|                | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5              |                   |         |         |                     |                     |
	|                | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|                | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|                | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 -- sh        |                   |         |         |                     |                     |
	|                | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x              |                   |         |         |                     |                     |
	|                | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|                | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|                | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x -- sh        |                   |         |         |                     |                     |
	|                | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| node           | add -p ha-949000 -v=7                | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT |                     |
	|                | --alsologtostderr                    |                   |         |         |                     |                     |
	|----------------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:29:09
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:29:09.276641    2876 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:29:09.276909    2876 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:29:09.276915    2876 out.go:358] Setting ErrFile to fd 2...
	I0831 15:29:09.276919    2876 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:29:09.277077    2876 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:29:09.278657    2876 out.go:352] Setting JSON to false
	I0831 15:29:09.304076    2876 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1720,"bootTime":1725141629,"procs":442,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:29:09.304206    2876 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:29:09.363205    2876 out.go:177] * [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:29:09.404287    2876 notify.go:220] Checking for updates...
	I0831 15:29:09.428120    2876 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:29:09.489040    2876 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:09.566857    2876 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:29:09.611464    2876 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:29:09.632356    2876 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:09.653358    2876 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:29:09.674652    2876 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:29:09.704277    2876 out.go:177] * Using the hyperkit driver based on user configuration
	I0831 15:29:09.746520    2876 start.go:297] selected driver: hyperkit
	I0831 15:29:09.746549    2876 start.go:901] validating driver "hyperkit" against <nil>
	I0831 15:29:09.746572    2876 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:29:09.750947    2876 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:29:09.751059    2876 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:29:09.759462    2876 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:29:09.763334    2876 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:09.763355    2876 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:29:09.763386    2876 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 15:29:09.763603    2876 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:29:09.763661    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:09.763670    2876 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0831 15:29:09.763676    2876 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0831 15:29:09.763757    2876 start.go:340] cluster config:
	{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:29:09.763847    2876 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:29:09.806188    2876 out.go:177] * Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	I0831 15:29:09.827330    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:09.827400    2876 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:29:09.827429    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:29:09.827640    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:29:09.827663    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:29:09.828200    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:09.828242    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json: {Name:mka3af2c42dba1cbf0f487cd55ddf735793024ce Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:09.828849    2876 start.go:360] acquireMachinesLock for ha-949000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:29:09.828952    2876 start.go:364] duration metric: took 84.577µs to acquireMachinesLock for "ha-949000"
	I0831 15:29:09.828988    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:09.829059    2876 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 15:29:09.903354    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:29:09.903628    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:09.903698    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:09.913643    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51029
	I0831 15:29:09.913991    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:09.914387    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:09.914395    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:09.914636    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:09.914768    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:09.914873    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:09.915000    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:29:09.915023    2876 client.go:168] LocalClient.Create starting
	I0831 15:29:09.915061    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:29:09.915112    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:09.915129    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:09.915188    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:29:09.915229    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:09.915249    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:09.915265    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:29:09.915270    2876 main.go:141] libmachine: (ha-949000) Calling .PreCreateCheck
	I0831 15:29:09.915359    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:09.915528    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:09.915949    2876 main.go:141] libmachine: Creating machine...
	I0831 15:29:09.915958    2876 main.go:141] libmachine: (ha-949000) Calling .Create
	I0831 15:29:09.916028    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:09.916144    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:09.916024    2884 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:09.916224    2876 main.go:141] libmachine: (ha-949000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:29:10.099863    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.099790    2884 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa...
	I0831 15:29:10.256390    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.256317    2884 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk...
	I0831 15:29:10.256437    2876 main.go:141] libmachine: (ha-949000) DBG | Writing magic tar header
	I0831 15:29:10.256445    2876 main.go:141] libmachine: (ha-949000) DBG | Writing SSH key tar header
	I0831 15:29:10.257253    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.257126    2884 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000 ...
	I0831 15:29:10.614937    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:10.614967    2876 main.go:141] libmachine: (ha-949000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid
	I0831 15:29:10.615070    2876 main.go:141] libmachine: (ha-949000) DBG | Using UUID 98cab9ba-901d-49d1-9e6c-321a4533d56e
	I0831 15:29:10.724629    2876 main.go:141] libmachine: (ha-949000) DBG | Generated MAC ce:8:77:f7:42:5e
	I0831 15:29:10.724653    2876 main.go:141] libmachine: (ha-949000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:29:10.724744    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ae630)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:10.724785    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ae630)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:10.724823    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "98cab9ba-901d-49d1-9e6c-321a4533d56e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:29:10.724851    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 98cab9ba-901d-49d1-9e6c-321a4533d56e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:29:10.724862    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:29:10.727687    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Pid is 2887
	I0831 15:29:10.728136    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 0
	I0831 15:29:10.728145    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:10.728201    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:10.729180    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:10.729276    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:10.729293    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:10.729309    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:10.729317    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:10.735289    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:29:10.788351    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:29:10.788955    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:10.788972    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:10.788980    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:10.788989    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:11.164652    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:29:11.164668    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:29:11.279214    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:11.279233    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:11.279245    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:11.279263    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:11.280165    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:29:11.280176    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:29:12.729552    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 1
	I0831 15:29:12.729568    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:12.729694    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:12.730495    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:12.730552    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:12.730566    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:12.730580    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:12.730595    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:14.731472    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 2
	I0831 15:29:14.731486    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:14.731548    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:14.732412    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:14.732458    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:14.732473    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:14.732492    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:14.732506    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:16.732786    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 3
	I0831 15:29:16.732802    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:16.732855    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:16.733685    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:16.733713    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:16.733721    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:16.733748    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:16.733759    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:16.839902    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:29:16.839946    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:29:16.839959    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:29:16.864989    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:29:18.735154    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 4
	I0831 15:29:18.735170    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:18.735286    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:18.736038    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:18.736084    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:18.736094    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:18.736103    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:18.736112    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:20.736683    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 5
	I0831 15:29:20.736698    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:20.736791    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:20.737588    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:20.737620    2876 main.go:141] libmachine: (ha-949000) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:20.737633    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:20.737640    2876 main.go:141] libmachine: (ha-949000) DBG | Found match: ce:8:77:f7:42:5e
	I0831 15:29:20.737645    2876 main.go:141] libmachine: (ha-949000) DBG | IP: 192.169.0.5
	I0831 15:29:20.737694    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:20.738300    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:20.738400    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:20.738493    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:29:20.738503    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:20.738582    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:20.738639    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:20.739400    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:29:20.739409    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:29:20.739415    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:29:20.739420    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:20.739500    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:20.739608    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:20.739694    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:20.739784    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:20.739906    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:20.740082    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:20.740088    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:29:21.810169    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:29:21.810183    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:29:21.810190    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.810319    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.810409    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.810520    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.810622    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.810753    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.810899    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.810907    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:29:21.876064    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:29:21.876103    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:29:21.876110    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:29:21.876116    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:21.876252    2876 buildroot.go:166] provisioning hostname "ha-949000"
	I0831 15:29:21.876263    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:21.876353    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.876438    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.876542    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.876625    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.876705    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.876835    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.876977    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.876986    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000 && echo "ha-949000" | sudo tee /etc/hostname
	I0831 15:29:21.955731    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000
	
	I0831 15:29:21.955752    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.955889    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.955998    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.956098    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.956196    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.956332    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.956482    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.956494    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:29:22.031652    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:29:22.031674    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:29:22.031695    2876 buildroot.go:174] setting up certificates
	I0831 15:29:22.031704    2876 provision.go:84] configureAuth start
	I0831 15:29:22.031711    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:22.031840    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:22.031922    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.032006    2876 provision.go:143] copyHostCerts
	I0831 15:29:22.032046    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:29:22.032109    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:29:22.032118    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:29:22.032257    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:29:22.032465    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:29:22.032502    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:29:22.032507    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:29:22.032592    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:29:22.032752    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:29:22.032790    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:29:22.032795    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:29:22.032874    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:29:22.033015    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000 san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]
	I0831 15:29:22.113278    2876 provision.go:177] copyRemoteCerts
	I0831 15:29:22.113334    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:29:22.113349    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.113477    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.113572    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.113653    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.113746    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:22.153055    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:29:22.153132    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:29:22.173186    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:29:22.173254    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0831 15:29:22.192526    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:29:22.192581    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:29:22.212150    2876 provision.go:87] duration metric: took 180.428736ms to configureAuth
	I0831 15:29:22.212163    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:29:22.212301    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:22.212314    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:22.212441    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.212522    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.212600    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.212680    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.212760    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.212882    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.213008    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.213015    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:29:22.281023    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:29:22.281035    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:29:22.281108    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:29:22.281121    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.281265    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.281355    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.281474    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.281559    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.281695    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.281836    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.281881    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:29:22.358523    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:29:22.358550    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.358687    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.358785    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.358873    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.358967    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.359137    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.359281    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.359293    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:29:23.900860    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:29:23.900883    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:29:23.900890    2876 main.go:141] libmachine: (ha-949000) Calling .GetURL
	I0831 15:29:23.901027    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:29:23.901035    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:29:23.901040    2876 client.go:171] duration metric: took 13.985813631s to LocalClient.Create
	I0831 15:29:23.901051    2876 start.go:167] duration metric: took 13.985855387s to libmachine.API.Create "ha-949000"
	I0831 15:29:23.901061    2876 start.go:293] postStartSetup for "ha-949000" (driver="hyperkit")
	I0831 15:29:23.901070    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:29:23.901080    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:23.901239    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:29:23.901251    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:23.901337    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:23.901438    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.901525    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:23.901622    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:23.947237    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:29:23.951946    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:29:23.951965    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:29:23.952069    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:29:23.952248    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:29:23.952255    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:29:23.952462    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:29:23.961814    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:29:23.990864    2876 start.go:296] duration metric: took 89.791408ms for postStartSetup
	I0831 15:29:23.990895    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:23.991499    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:23.991642    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:23.991961    2876 start.go:128] duration metric: took 14.162686523s to createHost
	I0831 15:29:23.991974    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:23.992084    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:23.992175    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.992259    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.992348    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:23.992457    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:23.992584    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:23.992591    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:29:24.059500    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143363.867477750
	
	I0831 15:29:24.059512    2876 fix.go:216] guest clock: 1725143363.867477750
	I0831 15:29:24.059517    2876 fix.go:229] Guest: 2024-08-31 15:29:23.86747775 -0700 PDT Remote: 2024-08-31 15:29:23.991969 -0700 PDT m=+14.752935961 (delta=-124.49125ms)
	I0831 15:29:24.059536    2876 fix.go:200] guest clock delta is within tolerance: -124.49125ms
	I0831 15:29:24.059546    2876 start.go:83] releasing machines lock for "ha-949000", held for 14.230377343s
	I0831 15:29:24.059565    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.059706    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:24.059819    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060132    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060244    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060319    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:29:24.060346    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:24.060384    2876 ssh_runner.go:195] Run: cat /version.json
	I0831 15:29:24.060396    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:24.060439    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:24.060498    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:24.060525    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:24.060623    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:24.060654    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:24.060746    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:24.060765    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:24.060837    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:24.096035    2876 ssh_runner.go:195] Run: systemctl --version
	I0831 15:29:24.148302    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:29:24.153275    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:29:24.153315    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:29:24.165840    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:29:24.165854    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:29:24.165972    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:29:24.181258    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:29:24.191149    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:29:24.200150    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:29:24.200197    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:29:24.209198    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:29:24.217930    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:29:24.227002    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:29:24.237048    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:29:24.246383    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:29:24.255322    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:29:24.264369    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:29:24.273487    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:29:24.282138    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:29:24.290220    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:24.385700    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:29:24.407032    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:29:24.407111    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:29:24.421439    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:29:24.437414    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:29:24.451401    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:29:24.463382    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:29:24.474406    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:29:24.507277    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:29:24.517707    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:29:24.532548    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:29:24.535464    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:29:24.542699    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:29:24.557395    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:29:24.662440    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:29:24.769422    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:29:24.769500    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:29:24.784888    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:24.881202    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:29:27.276172    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.394917578s)
	I0831 15:29:27.276233    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:29:27.287739    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:29:27.301676    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:29:27.312754    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:29:27.407771    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:29:27.503429    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:27.614933    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:29:27.628621    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:29:27.641141    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:27.759998    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:29:27.816359    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:29:27.816437    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:29:27.820881    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:29:27.820929    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:29:27.824109    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:29:27.852863    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:29:27.852937    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:29:27.870865    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:29:27.937728    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:29:27.937791    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:27.938219    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:29:27.943196    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:29:27.954353    2876 kubeadm.go:883] updating cluster {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:29:27.954419    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:27.954480    2876 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:29:27.967028    2876 docker.go:685] Got preloaded images: 
	I0831 15:29:27.967040    2876 docker.go:691] registry.k8s.io/kube-apiserver:v1.31.0 wasn't preloaded
	I0831 15:29:27.967094    2876 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0831 15:29:27.975409    2876 ssh_runner.go:195] Run: which lz4
	I0831 15:29:27.978323    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0831 15:29:27.978434    2876 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0831 15:29:27.981530    2876 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0831 15:29:27.981546    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (342554258 bytes)
	I0831 15:29:28.829399    2876 docker.go:649] duration metric: took 850.988233ms to copy over tarball
	I0831 15:29:28.829466    2876 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0831 15:29:31.094292    2876 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.264775779s)
	I0831 15:29:31.094306    2876 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0831 15:29:31.120523    2876 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0831 15:29:31.129444    2876 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2631 bytes)
	I0831 15:29:31.144462    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:31.255144    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:29:33.625508    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.370311255s)
	I0831 15:29:33.625595    2876 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:29:33.642024    2876 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0831 15:29:33.642043    2876 cache_images.go:84] Images are preloaded, skipping loading
	I0831 15:29:33.642059    2876 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.31.0 docker true true} ...
	I0831 15:29:33.642140    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:29:33.642205    2876 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:29:33.687213    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:33.687227    2876 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0831 15:29:33.687238    2876 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:29:33.687253    2876 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-949000 NodeName:ha-949000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:29:33.687355    2876 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-949000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 15:29:33.687380    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:29:33.687436    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:29:33.701609    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:29:33.701679    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:29:33.701731    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:29:33.709907    2876 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:29:33.709972    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0831 15:29:33.717287    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0831 15:29:33.730443    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:29:33.743765    2876 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0831 15:29:33.758082    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1446 bytes)
	I0831 15:29:33.771561    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:29:33.774412    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:29:33.783869    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:33.875944    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:29:33.891425    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.5
	I0831 15:29:33.891438    2876 certs.go:194] generating shared ca certs ...
	I0831 15:29:33.891448    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:33.891633    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:29:33.891710    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:29:33.891723    2876 certs.go:256] generating profile certs ...
	I0831 15:29:33.891775    2876 certs.go:363] generating signed profile cert for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:29:33.891786    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt with IP's: []
	I0831 15:29:34.044423    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt ...
	I0831 15:29:34.044439    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt: {Name:mkff87193f625d157d1a4f89b0da256c90604083 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.044784    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key ...
	I0831 15:29:34.044793    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key: {Name:mke1833d9b208b07a8ff6dd57d320eb167de83a3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.045031    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93
	I0831 15:29:34.045046    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.254]
	I0831 15:29:34.207099    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 ...
	I0831 15:29:34.207118    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93: {Name:mk38f2742462440beada92d4e254471d0fe85db9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.207433    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93 ...
	I0831 15:29:34.207443    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93: {Name:mk29a130e2c97d3f060f247819d7c01c723a8502 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.207661    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:29:34.207842    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:29:34.208036    2876 certs.go:363] generating signed profile cert for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:29:34.208050    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt with IP's: []
	I0831 15:29:34.314095    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt ...
	I0831 15:29:34.314111    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt: {Name:mk708e4939e774d52c9a7d3335e0202d13493538 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.314481    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key ...
	I0831 15:29:34.314489    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key: {Name:mkcfbb0611781f7e5640984b0a9cc91976dc5482 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.314700    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:29:34.314732    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:29:34.314751    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:29:34.314769    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:29:34.314787    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:29:34.314811    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:29:34.314831    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:29:34.314850    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:29:34.314947    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:29:34.314997    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:29:34.315005    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:29:34.315034    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:29:34.315062    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:29:34.315091    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:29:34.315155    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:29:34.315187    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.315211    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.315229    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.315668    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:29:34.335288    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:29:34.355233    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:29:34.374357    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:29:34.393538    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0831 15:29:34.413840    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:29:34.433106    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:29:34.452816    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:29:34.472204    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:29:34.492102    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:29:34.512126    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:29:34.530945    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:29:34.546877    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:29:34.551681    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:29:34.565047    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.568688    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.568737    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.573250    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:29:34.587250    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:29:34.595871    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.599208    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.599248    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.603521    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:29:34.611689    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:29:34.620193    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.624378    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.624428    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.628785    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:29:34.637154    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:29:34.640263    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:29:34.640305    2876 kubeadm.go:392] StartCluster: {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:29:34.640393    2876 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:29:34.652254    2876 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:29:34.660013    2876 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0831 15:29:34.668312    2876 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0831 15:29:34.675860    2876 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0831 15:29:34.675868    2876 kubeadm.go:157] found existing configuration files:
	
	I0831 15:29:34.675907    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0831 15:29:34.683169    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0831 15:29:34.683212    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0831 15:29:34.690543    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0831 15:29:34.697493    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0831 15:29:34.697539    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0831 15:29:34.704850    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0831 15:29:34.712593    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0831 15:29:34.712643    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0831 15:29:34.720047    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0831 15:29:34.727239    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0831 15:29:34.727279    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0831 15:29:34.734575    2876 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0831 15:29:34.806234    2876 kubeadm.go:310] [init] Using Kubernetes version: v1.31.0
	I0831 15:29:34.806318    2876 kubeadm.go:310] [preflight] Running pre-flight checks
	I0831 15:29:34.880330    2876 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0831 15:29:34.880424    2876 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0831 15:29:34.880492    2876 kubeadm.go:310] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I0831 15:29:34.888288    2876 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0831 15:29:34.931799    2876 out.go:235]   - Generating certificates and keys ...
	I0831 15:29:34.931855    2876 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0831 15:29:34.931917    2876 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0831 15:29:35.094247    2876 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0831 15:29:35.242021    2876 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0831 15:29:35.553368    2876 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0831 15:29:35.874778    2876 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0831 15:29:36.045823    2876 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0831 15:29:36.046072    2876 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-949000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0831 15:29:36.253528    2876 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0831 15:29:36.253651    2876 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-949000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0831 15:29:36.362185    2876 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0831 15:29:36.481613    2876 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0831 15:29:36.595099    2876 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0831 15:29:36.595231    2876 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0831 15:29:36.687364    2876 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0831 15:29:36.786350    2876 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0831 15:29:36.838505    2876 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0831 15:29:37.183406    2876 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0831 15:29:37.330529    2876 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0831 15:29:37.331123    2876 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0831 15:29:37.332869    2876 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0831 15:29:37.354639    2876 out.go:235]   - Booting up control plane ...
	I0831 15:29:37.354715    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0831 15:29:37.354798    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0831 15:29:37.354856    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0831 15:29:37.354940    2876 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0831 15:29:37.355015    2876 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0831 15:29:37.355046    2876 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0831 15:29:37.462381    2876 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0831 15:29:37.462478    2876 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I0831 15:29:37.972217    2876 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 510.286911ms
	I0831 15:29:37.972306    2876 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0831 15:29:43.988604    2876 kubeadm.go:310] [api-check] The API server is healthy after 6.020603512s
	I0831 15:29:44.000520    2876 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0831 15:29:44.008573    2876 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0831 15:29:44.022134    2876 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0831 15:29:44.022318    2876 kubeadm.go:310] [mark-control-plane] Marking the node ha-949000 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0831 15:29:44.029102    2876 kubeadm.go:310] [bootstrap-token] Using token: zw6kb9.o9r4potygin4i7x2
	I0831 15:29:44.050780    2876 out.go:235]   - Configuring RBAC rules ...
	I0831 15:29:44.050942    2876 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0831 15:29:44.094287    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0831 15:29:44.099052    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0831 15:29:44.101377    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0831 15:29:44.103328    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0831 15:29:44.105426    2876 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0831 15:29:44.395210    2876 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0831 15:29:44.821705    2876 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0831 15:29:45.395130    2876 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0831 15:29:45.396108    2876 kubeadm.go:310] 
	I0831 15:29:45.396158    2876 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0831 15:29:45.396163    2876 kubeadm.go:310] 
	I0831 15:29:45.396236    2876 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0831 15:29:45.396245    2876 kubeadm.go:310] 
	I0831 15:29:45.396264    2876 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0831 15:29:45.396314    2876 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0831 15:29:45.396355    2876 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0831 15:29:45.396359    2876 kubeadm.go:310] 
	I0831 15:29:45.396397    2876 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0831 15:29:45.396406    2876 kubeadm.go:310] 
	I0831 15:29:45.396453    2876 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0831 15:29:45.396458    2876 kubeadm.go:310] 
	I0831 15:29:45.396496    2876 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0831 15:29:45.396560    2876 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0831 15:29:45.396617    2876 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0831 15:29:45.396623    2876 kubeadm.go:310] 
	I0831 15:29:45.396691    2876 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0831 15:29:45.396760    2876 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0831 15:29:45.396766    2876 kubeadm.go:310] 
	I0831 15:29:45.396839    2876 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token zw6kb9.o9r4potygin4i7x2 \
	I0831 15:29:45.396919    2876 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 \
	I0831 15:29:45.396939    2876 kubeadm.go:310] 	--control-plane 
	I0831 15:29:45.396943    2876 kubeadm.go:310] 
	I0831 15:29:45.397018    2876 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0831 15:29:45.397029    2876 kubeadm.go:310] 
	I0831 15:29:45.397093    2876 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token zw6kb9.o9r4potygin4i7x2 \
	I0831 15:29:45.397173    2876 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 
	I0831 15:29:45.397526    2876 kubeadm.go:310] W0831 22:29:34.618825    1608 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 15:29:45.397751    2876 kubeadm.go:310] W0831 22:29:34.619993    1608 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 15:29:45.397847    2876 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0831 15:29:45.397857    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:45.397874    2876 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0831 15:29:45.420531    2876 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0831 15:29:45.477445    2876 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0831 15:29:45.482633    2876 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.31.0/kubectl ...
	I0831 15:29:45.482643    2876 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2601 bytes)
	I0831 15:29:45.498168    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0831 15:29:45.749965    2876 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0831 15:29:45.750050    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000 minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=true
	I0831 15:29:45.750061    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:45.882304    2876 ops.go:34] apiserver oom_adj: -16
	I0831 15:29:45.896818    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:46.398021    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:46.897815    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:47.397274    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:47.897049    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:48.397593    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:48.462357    2876 kubeadm.go:1113] duration metric: took 2.712335704s to wait for elevateKubeSystemPrivileges
	I0831 15:29:48.462374    2876 kubeadm.go:394] duration metric: took 13.821875392s to StartCluster
	I0831 15:29:48.462389    2876 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:48.462482    2876 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:48.462909    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:48.463157    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0831 15:29:48.463168    2876 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:48.463181    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:29:48.463194    2876 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 15:29:48.463223    2876 addons.go:69] Setting storage-provisioner=true in profile "ha-949000"
	I0831 15:29:48.463228    2876 addons.go:69] Setting default-storageclass=true in profile "ha-949000"
	I0831 15:29:48.463245    2876 addons.go:234] Setting addon storage-provisioner=true in "ha-949000"
	I0831 15:29:48.463250    2876 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-949000"
	I0831 15:29:48.463260    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:29:48.463303    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:48.463512    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.463518    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.463528    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.463540    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.472681    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51052
	I0831 15:29:48.473013    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51054
	I0831 15:29:48.473095    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.473332    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.473451    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.473463    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.473652    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.473665    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.473689    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.473921    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.474101    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.474113    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.474145    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.474214    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.474299    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.476440    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:48.476667    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 15:29:48.477025    2876 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 15:29:48.477197    2876 addons.go:234] Setting addon default-storageclass=true in "ha-949000"
	I0831 15:29:48.477218    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:29:48.477428    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.477442    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.483175    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51056
	I0831 15:29:48.483519    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.483886    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.483904    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.484146    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.484254    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.484334    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.484406    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.485343    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:48.485904    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51058
	I0831 15:29:48.486187    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.486486    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.486495    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.486696    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.487040    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.487078    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.495680    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51060
	I0831 15:29:48.496017    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.496360    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.496389    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.496611    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.496715    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.496791    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.496872    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.497794    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:48.497926    2876 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0831 15:29:48.497934    2876 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0831 15:29:48.497944    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:48.498021    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:48.498099    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:48.498200    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:48.498277    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:48.507200    2876 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0831 15:29:48.527696    2876 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0831 15:29:48.527708    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0831 15:29:48.527725    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:48.527878    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:48.527981    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:48.528082    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:48.528217    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:48.528370    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.169.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0831 15:29:48.564053    2876 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0831 15:29:48.586435    2876 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0831 15:29:48.827708    2876 start.go:971] {"host.minikube.internal": 192.169.0.1} host record injected into CoreDNS's ConfigMap
	I0831 15:29:48.827730    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.827739    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.827907    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.827916    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.827922    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.827926    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.828046    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.828049    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:48.828058    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.828113    2876 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 15:29:48.828125    2876 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 15:29:48.828210    2876 round_trippers.go:463] GET https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0831 15:29:48.828215    2876 round_trippers.go:469] Request Headers:
	I0831 15:29:48.828223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:29:48.828227    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:29:48.833724    2876 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:29:48.834156    2876 round_trippers.go:463] PUT https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0831 15:29:48.834163    2876 round_trippers.go:469] Request Headers:
	I0831 15:29:48.834169    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:29:48.834199    2876 round_trippers.go:473]     Content-Type: application/json
	I0831 15:29:48.834205    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:29:48.835718    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:29:48.835861    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.835876    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.836028    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.836037    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.836048    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.019783    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:49.019796    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:49.019979    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:49.019989    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:49.019994    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:49.019999    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:49.019999    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.020151    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:49.020153    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.020159    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:49.059498    2876 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
	I0831 15:29:49.117324    2876 addons.go:510] duration metric: took 654.121351ms for enable addons: enabled=[default-storageclass storage-provisioner]
	I0831 15:29:49.117374    2876 start.go:246] waiting for cluster config update ...
	I0831 15:29:49.117390    2876 start.go:255] writing updated cluster config ...
	I0831 15:29:49.155430    2876 out.go:201] 
	I0831 15:29:49.192527    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:49.192625    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:49.214378    2876 out.go:177] * Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	I0831 15:29:49.272137    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:49.272171    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:29:49.272338    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:29:49.272356    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:29:49.272445    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:49.273113    2876 start.go:360] acquireMachinesLock for ha-949000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:29:49.273204    2876 start.go:364] duration metric: took 68.322µs to acquireMachinesLock for "ha-949000-m02"
	I0831 15:29:49.273234    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:49.273329    2876 start.go:125] createHost starting for "m02" (driver="hyperkit")
	I0831 15:29:49.296266    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:29:49.296429    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:49.296488    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:49.306391    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51065
	I0831 15:29:49.306732    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:49.307039    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:49.307051    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:49.307254    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:49.307374    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:29:49.307457    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:29:49.307559    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:29:49.307576    2876 client.go:168] LocalClient.Create starting
	I0831 15:29:49.307604    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:29:49.307643    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:49.307655    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:49.307696    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:29:49.307726    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:49.307735    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:49.307749    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:29:49.307754    2876 main.go:141] libmachine: (ha-949000-m02) Calling .PreCreateCheck
	I0831 15:29:49.307836    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.307906    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:29:49.333695    2876 main.go:141] libmachine: Creating machine...
	I0831 15:29:49.333716    2876 main.go:141] libmachine: (ha-949000-m02) Calling .Create
	I0831 15:29:49.333916    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.334092    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.333909    2898 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:49.334195    2876 main.go:141] libmachine: (ha-949000-m02) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:29:49.534537    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.534440    2898 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa...
	I0831 15:29:49.629999    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.629917    2898 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk...
	I0831 15:29:49.630021    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Writing magic tar header
	I0831 15:29:49.630031    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Writing SSH key tar header
	I0831 15:29:49.630578    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.630526    2898 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02 ...
	I0831 15:29:49.986563    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.986593    2876 main.go:141] libmachine: (ha-949000-m02) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid
	I0831 15:29:49.986663    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Using UUID 23e5d675-5201-4f3d-86b7-b25c818528d1
	I0831 15:29:50.021467    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Generated MAC 92:7:3c:3f:ee:b7
	I0831 15:29:50.021484    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:29:50.021548    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:50.021582    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:50.021623    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "23e5d675-5201-4f3d-86b7-b25c818528d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:29:50.021665    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 23e5d675-5201-4f3d-86b7-b25c818528d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:29:50.021684    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:29:50.024624    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Pid is 2899
	I0831 15:29:50.025044    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 0
	I0831 15:29:50.025058    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:50.025119    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:50.026207    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:50.026276    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:50.026305    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:50.026350    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:50.026373    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:50.026416    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:50.032754    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:29:50.041001    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:29:50.041896    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:50.041918    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:50.041929    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:50.041946    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:50.432260    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:29:50.432276    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:29:50.547071    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:50.547090    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:50.547112    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:50.547127    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:50.547965    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:29:50.547973    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:29:52.027270    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 1
	I0831 15:29:52.027288    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:52.027415    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:52.028177    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:52.028225    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:52.028236    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:52.028247    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:52.028254    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:52.028263    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:54.029110    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 2
	I0831 15:29:54.029126    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:54.029231    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:54.029999    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:54.030057    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:54.030075    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:54.030087    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:54.030095    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:54.030103    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:56.031274    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 3
	I0831 15:29:56.031292    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:56.031369    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:56.032155    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:56.032168    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:56.032178    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:56.032196    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:56.032213    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:56.032224    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:56.132338    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:29:56.132386    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:29:56.132396    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:29:56.155372    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:29:58.032308    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 4
	I0831 15:29:58.032325    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:58.032424    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:58.033214    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:58.033247    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:58.033259    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:58.033269    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:58.033278    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:58.033287    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:30:00.033449    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 5
	I0831 15:30:00.033465    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:00.033544    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:30:00.034313    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:30:00.034404    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:30:00.034418    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:30:00.034426    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found match: 92:7:3c:3f:ee:b7
	I0831 15:30:00.034433    2876 main.go:141] libmachine: (ha-949000-m02) DBG | IP: 192.169.0.6
	I0831 15:30:00.034475    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:30:00.035147    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:00.035249    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:00.035348    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:30:00.035357    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:30:00.035434    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:00.035493    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:30:00.036274    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:30:00.036284    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:30:00.036289    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:30:00.036293    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:00.036398    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:00.036485    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:00.036575    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:00.036655    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:00.036771    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:00.036969    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:00.036976    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:30:01.059248    2876 main.go:141] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I0831 15:30:04.124333    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:30:04.124345    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:30:04.124351    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.124488    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.124590    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.124683    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.124778    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.124921    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.125101    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.125110    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:30:04.190272    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:30:04.190323    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:30:04.190329    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:30:04.190334    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.190465    2876 buildroot.go:166] provisioning hostname "ha-949000-m02"
	I0831 15:30:04.190476    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.190558    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.190652    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.190763    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.190844    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.190943    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.191068    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.191204    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.191213    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m02 && echo "ha-949000-m02" | sudo tee /etc/hostname
	I0831 15:30:04.267934    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m02
	
	I0831 15:30:04.267948    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.268081    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.268202    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.268299    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.268391    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.268525    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.268665    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.268684    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:30:04.340314    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:30:04.340330    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:30:04.340340    2876 buildroot.go:174] setting up certificates
	I0831 15:30:04.340346    2876 provision.go:84] configureAuth start
	I0831 15:30:04.340353    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.340483    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:04.340577    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.340665    2876 provision.go:143] copyHostCerts
	I0831 15:30:04.340691    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:30:04.340751    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:30:04.340757    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:30:04.340904    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:30:04.341121    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:30:04.341161    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:30:04.341166    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:30:04.341243    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:30:04.341390    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:30:04.341427    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:30:04.341432    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:30:04.341508    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:30:04.341670    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m02 san=[127.0.0.1 192.169.0.6 ha-949000-m02 localhost minikube]
	I0831 15:30:04.509456    2876 provision.go:177] copyRemoteCerts
	I0831 15:30:04.509508    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:30:04.509523    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.509674    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.509762    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.509874    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.509973    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:04.550810    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:30:04.550883    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:30:04.571982    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:30:04.572058    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:30:04.592601    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:30:04.592680    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:30:04.612516    2876 provision.go:87] duration metric: took 272.157929ms to configureAuth
	I0831 15:30:04.612531    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:30:04.612691    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:04.612706    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:04.612851    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.612970    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.613064    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.613150    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.613227    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.613345    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.613483    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.613491    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:30:04.678333    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:30:04.678345    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:30:04.678436    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:30:04.678450    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.678582    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.678669    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.678767    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.678846    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.678978    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.679124    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.679167    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:30:04.756204    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:30:04.756224    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.756411    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.756527    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.756630    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.756734    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.756851    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.757006    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.757027    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:30:06.370825    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:30:06.370840    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:30:06.370855    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetURL
	I0831 15:30:06.370996    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:30:06.371003    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:30:06.371008    2876 client.go:171] duration metric: took 17.063185858s to LocalClient.Create
	I0831 15:30:06.371017    2876 start.go:167] duration metric: took 17.063218984s to libmachine.API.Create "ha-949000"
	I0831 15:30:06.371023    2876 start.go:293] postStartSetup for "ha-949000-m02" (driver="hyperkit")
	I0831 15:30:06.371029    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:30:06.371039    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.371176    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:30:06.371190    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.371279    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.371365    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.371448    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.371522    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:06.410272    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:30:06.413456    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:30:06.413467    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:30:06.413573    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:30:06.413753    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:30:06.413762    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:30:06.413962    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:30:06.421045    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:30:06.440540    2876 start.go:296] duration metric: took 69.508758ms for postStartSetup
	I0831 15:30:06.440562    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:30:06.441179    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:06.441343    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:30:06.441726    2876 start.go:128] duration metric: took 17.168146238s to createHost
	I0831 15:30:06.441741    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.441826    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.441909    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.442008    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.442102    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.442220    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:06.442339    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:06.442346    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:30:06.507669    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143406.563138986
	
	I0831 15:30:06.507682    2876 fix.go:216] guest clock: 1725143406.563138986
	I0831 15:30:06.507687    2876 fix.go:229] Guest: 2024-08-31 15:30:06.563138986 -0700 PDT Remote: 2024-08-31 15:30:06.441735 -0700 PDT m=+57.202103081 (delta=121.403986ms)
	I0831 15:30:06.507698    2876 fix.go:200] guest clock delta is within tolerance: 121.403986ms
	I0831 15:30:06.507701    2876 start.go:83] releasing machines lock for "ha-949000-m02", held for 17.234244881s
	I0831 15:30:06.507719    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.507845    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:06.534518    2876 out.go:177] * Found network options:
	I0831 15:30:06.585154    2876 out.go:177]   - NO_PROXY=192.169.0.5
	W0831 15:30:06.608372    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:30:06.608434    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609377    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609624    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609725    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:30:06.609763    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	W0831 15:30:06.609837    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:30:06.609978    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.609993    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:30:06.610018    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.610265    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.610300    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.610460    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.610487    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.610621    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:06.610643    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.610806    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	W0831 15:30:06.649012    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:30:06.649075    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:30:06.693849    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:30:06.693863    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:30:06.693938    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:30:06.709316    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:30:06.718380    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:30:06.727543    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:30:06.727609    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:30:06.736698    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:30:06.745615    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:30:06.755140    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:30:06.764398    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:30:06.773464    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:30:06.782661    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:30:06.791918    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:30:06.801132    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:30:06.809259    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:30:06.817528    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:06.918051    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:30:06.937658    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:30:06.937726    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:30:06.952225    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:30:06.964364    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:30:06.981641    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:30:06.992676    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:30:07.003746    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:30:07.061399    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:30:07.071765    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:30:07.086915    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:30:07.089960    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:30:07.097339    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:30:07.110902    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:30:07.218878    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:30:07.327438    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:30:07.327478    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:30:07.343077    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:07.455166    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:30:09.753051    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.297833346s)
	I0831 15:30:09.753112    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:30:09.763410    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:30:09.776197    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:30:09.788015    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:30:09.886287    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:30:09.979666    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:10.091986    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:30:10.105474    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:30:10.116526    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:10.223654    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:30:10.284365    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:30:10.284447    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:30:10.288841    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:30:10.288894    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:30:10.292674    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:30:10.327492    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:30:10.327571    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:30:10.348428    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:30:10.394804    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:30:10.438643    2876 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:30:10.460438    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:10.460677    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:30:10.463911    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:30:10.474227    2876 mustload.go:65] Loading cluster: ha-949000
	I0831 15:30:10.474382    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:10.474620    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:10.474636    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:10.483465    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51091
	I0831 15:30:10.483852    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:10.484170    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:10.484182    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:10.484380    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:10.484504    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:30:10.484591    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:10.484661    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:30:10.485631    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:30:10.485888    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:10.485912    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:10.494468    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51093
	I0831 15:30:10.494924    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:10.495238    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:10.495250    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:10.495476    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:10.495585    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:30:10.495693    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.6
	I0831 15:30:10.495700    2876 certs.go:194] generating shared ca certs ...
	I0831 15:30:10.495711    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.495883    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:30:10.495953    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:30:10.495961    2876 certs.go:256] generating profile certs ...
	I0831 15:30:10.496069    2876 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:30:10.496092    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952
	I0831 15:30:10.496104    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0831 15:30:10.585710    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 ...
	I0831 15:30:10.585732    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952: {Name:mkfd98043f041b827744dcc9a0bc27d9f7ba3a8d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.586080    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952 ...
	I0831 15:30:10.586093    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952: {Name:mk6025bd0561394827636d384e273ec532f21510 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.586307    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:30:10.586527    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:30:10.586791    2876 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:30:10.586800    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:30:10.586823    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:30:10.586842    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:30:10.586860    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:30:10.586879    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:30:10.586902    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:30:10.586921    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:30:10.586939    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:30:10.587027    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:30:10.587073    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:30:10.587082    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:30:10.587115    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:30:10.587145    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:30:10.587174    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:30:10.587237    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:30:10.587271    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:10.587293    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:30:10.587312    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:30:10.587343    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:30:10.587493    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:30:10.587598    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:30:10.587689    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:30:10.587790    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:30:10.619319    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:30:10.622586    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:30:10.631798    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:30:10.634863    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:30:10.644806    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:30:10.648392    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:30:10.657224    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:30:10.660506    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:30:10.668998    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:30:10.672282    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:30:10.681734    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:30:10.685037    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:30:10.697579    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:30:10.717100    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:30:10.736755    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:30:10.757074    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:30:10.776635    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0831 15:30:10.796052    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:30:10.815309    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:30:10.834549    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:30:10.854663    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:30:10.873734    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:30:10.892872    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:30:10.912223    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:30:10.925669    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:30:10.939310    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:30:10.952723    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:30:10.966203    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:30:10.980670    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:30:10.994195    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:30:11.007818    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:30:11.012076    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:30:11.021306    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.024674    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.024710    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.028962    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:30:11.038172    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:30:11.048226    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.051704    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.051746    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.056026    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:30:11.065281    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:30:11.074586    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.077977    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.078018    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.082263    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:30:11.091560    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:30:11.094606    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:30:11.094641    2876 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.31.0 docker true true} ...
	I0831 15:30:11.094696    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:30:11.094712    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:30:11.094743    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:30:11.107306    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:30:11.107348    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:30:11.107400    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:30:11.116476    2876 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:30:11.116538    2876 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:30:11.125199    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet
	I0831 15:30:11.125199    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl
	I0831 15:30:11.125202    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm
	I0831 15:30:13.495982    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:30:13.496079    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:30:13.499639    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:30:13.499660    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:30:14.245316    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:30:14.245403    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:30:14.249019    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:30:14.249045    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:30:14.305452    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:30:14.335903    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:30:14.336035    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:30:14.348689    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:30:14.348746    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
	I0831 15:30:14.608960    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:30:14.617331    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:30:14.630716    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:30:14.643952    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:30:14.657665    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:30:14.660616    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:30:14.670825    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:14.766762    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:30:14.782036    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:30:14.782341    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:14.782363    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:14.791218    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51120
	I0831 15:30:14.791554    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:14.791943    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:14.791962    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:14.792169    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:14.792281    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:30:14.792379    2876 start.go:317] joinCluster: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 Clu
sterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpira
tion:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:30:14.792482    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0831 15:30:14.792500    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:30:14.792589    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:30:14.792677    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:30:14.792804    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:30:14.792889    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:30:14.904364    2876 start.go:343] trying to join control-plane node "m02" to cluster: &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:30:14.904404    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token sa5gl8.nk4lqkhvqrn6uouk --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443"
	I0831 15:30:43.067719    2876 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token sa5gl8.nk4lqkhvqrn6uouk --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443": (28.162893612s)
	I0831 15:30:43.067762    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0831 15:30:43.495593    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000-m02 minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=false
	I0831 15:30:43.584878    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-949000-m02 node-role.kubernetes.io/control-plane:NoSchedule-
	I0831 15:30:43.672222    2876 start.go:319] duration metric: took 28.879433845s to joinCluster
	I0831 15:30:43.672264    2876 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:30:43.672464    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:43.696001    2876 out.go:177] * Verifying Kubernetes components...
	I0831 15:30:43.753664    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:43.969793    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:30:43.995704    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:30:43.995955    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, U
serAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:30:43.995999    2876 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:30:43.996168    2876 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:30:43.996224    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:43.996229    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:43.996246    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:43.996253    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:44.008886    2876 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0831 15:30:44.496443    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:44.496458    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:44.496465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:44.496468    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:44.499732    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:44.996970    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:44.996984    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:44.996990    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:44.996993    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.000189    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:45.496917    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:45.496930    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:45.496936    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:45.496939    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.498866    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:30:45.996558    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:45.996579    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:45.996604    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:45.996626    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.999357    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:45.999667    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:46.496895    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:46.496907    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:46.496914    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:46.496917    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:46.499220    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:46.996382    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:46.996397    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:46.996403    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:46.996406    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:46.998788    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:47.497035    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:47.497048    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:47.497055    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:47.497059    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:47.499487    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:47.996662    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:47.996675    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:47.996695    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:47.996699    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:47.998935    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:48.496588    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:48.496603    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:48.496610    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:48.496613    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:48.498806    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:48.499160    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:48.996774    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:48.996800    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:48.996806    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:48.996810    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:48.998862    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:49.496728    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:49.496741    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:49.496748    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:49.496753    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:49.500270    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:49.996536    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:49.996548    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:49.996555    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:49.996560    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:49.998977    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:50.496423    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:50.496441    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:50.496452    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:50.496458    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:50.499488    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:50.499941    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:50.996502    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:50.996515    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:50.996520    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:50.996525    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:50.998339    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:30:51.496978    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:51.496999    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:51.497011    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:51.497018    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:51.499859    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:51.997186    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:51.997200    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:51.997207    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:51.997210    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.000228    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:52.498065    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:52.498084    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:52.498093    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:52.498097    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.500425    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:52.500868    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:52.996733    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:52.996786    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:52.996804    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:52.996819    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.999878    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:53.496732    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:53.496752    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:53.496764    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:53.496772    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:53.499723    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:53.996635    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:53.996698    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:53.996722    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:53.996730    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.000327    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:54.496855    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:54.496875    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:54.496883    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:54.496888    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.499247    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:54.996676    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:54.996692    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:54.996701    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.996706    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:54.999066    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:54.999477    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:55.496949    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:55.496960    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:55.496967    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:55.496971    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:55.499074    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:55.996611    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:55.996627    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:55.996644    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:55.996651    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:55.999061    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:56.497363    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:56.497376    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:56.497383    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:56.497386    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:56.499540    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:56.997791    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:56.997810    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:56.997822    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:56.997828    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.001116    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:57.001481    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:57.497843    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:57.497862    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:57.497874    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:57.497881    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.500770    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:57.998298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:57.998324    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:57.998335    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.998344    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.002037    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:58.496643    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:58.496664    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:58.496677    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:58.496683    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.499466    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:58.997398    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:58.997468    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:58.997484    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.997490    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.000768    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:59.498644    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:59.498668    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:59.498680    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:59.498685    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.502573    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:59.503046    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:59.996689    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:59.996715    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:59.996765    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:59.996773    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.999409    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:00.496654    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:00.496668    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.496677    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.496681    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.498585    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.499019    2876 node_ready.go:49] node "ha-949000-m02" has status "Ready":"True"
	I0831 15:31:00.499031    2876 node_ready.go:38] duration metric: took 16.50261118s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:31:00.499038    2876 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:31:00.499081    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:00.499087    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.499092    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.499095    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.502205    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.506845    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.506892    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:31:00.506897    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.506903    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.506908    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.508659    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.509078    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.509085    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.509091    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.509094    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.510447    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.510831    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.510839    2876 pod_ready.go:82] duration metric: took 3.983743ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.510852    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.510887    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:31:00.510892    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.510897    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.510901    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.512274    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.512740    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.512747    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.512752    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.512757    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.514085    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.514446    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.514457    2876 pod_ready.go:82] duration metric: took 3.596287ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.514464    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.514501    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:31:00.514506    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.514512    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.514515    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.517897    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.518307    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.518314    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.518320    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.518324    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.519756    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.520128    2876 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.520138    2876 pod_ready.go:82] duration metric: took 5.668748ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.520144    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.520177    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:31:00.520182    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.520187    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.520191    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.521454    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.521852    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:00.521860    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.521865    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.521870    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.523054    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.523372    2876 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.523381    2876 pod_ready.go:82] duration metric: took 3.231682ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.523393    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.698293    2876 request.go:632] Waited for 174.813181ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:31:00.698344    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:31:00.698420    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.698432    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.698439    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.701539    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.897673    2876 request.go:632] Waited for 195.424003ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.897783    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.897794    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.897805    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.897814    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.900981    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.901407    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.901419    2876 pod_ready.go:82] duration metric: took 378.015429ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.901429    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.097805    2876 request.go:632] Waited for 196.320526ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:31:01.097926    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:31:01.097936    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.097947    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.097955    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.100563    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.298122    2876 request.go:632] Waited for 197.162644ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:01.298157    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:01.298162    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.298168    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.298172    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.300402    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.300781    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:01.300791    2876 pod_ready.go:82] duration metric: took 399.34942ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.300807    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.497316    2876 request.go:632] Waited for 196.39746ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:31:01.497376    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:31:01.497387    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.497397    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.497405    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.500651    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:01.698231    2876 request.go:632] Waited for 196.759957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:01.698322    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:01.698333    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.698344    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.698353    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.701256    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.701766    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:01.701775    2876 pod_ready.go:82] duration metric: took 400.954779ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.701785    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.898783    2876 request.go:632] Waited for 196.946643ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:31:01.898903    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:31:01.898917    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.898929    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.898938    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.902347    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.097749    2876 request.go:632] Waited for 194.738931ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.097815    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.097824    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.097834    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.097843    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.101525    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.102016    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.102028    2876 pod_ready.go:82] duration metric: took 400.230387ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.102037    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.296929    2876 request.go:632] Waited for 194.771963ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:31:02.296979    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:31:02.296996    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.297010    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.297016    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.300518    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.498356    2876 request.go:632] Waited for 197.140595ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.498409    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.498414    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.498421    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.498425    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.500151    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:02.500554    2876 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.500564    2876 pod_ready.go:82] duration metric: took 398.515508ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.500577    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.697756    2876 request.go:632] Waited for 197.121926ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:31:02.697847    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:31:02.697859    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.697871    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.697879    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.701227    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.896975    2876 request.go:632] Waited for 195.16614ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:02.897029    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:02.897044    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.897050    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.897054    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.899135    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:02.899494    2876 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.899504    2876 pod_ready.go:82] duration metric: took 398.915896ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.899511    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.098441    2876 request.go:632] Waited for 198.871316ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:31:03.098576    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:31:03.098587    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.098599    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.098606    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.101995    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.297740    2876 request.go:632] Waited for 194.927579ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:03.297801    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:03.297842    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.297855    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.297863    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.300956    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.301560    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:03.301572    2876 pod_ready.go:82] duration metric: took 402.049602ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.301580    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.498380    2876 request.go:632] Waited for 196.707011ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:31:03.498472    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:31:03.498482    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.498494    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.498505    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.502174    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.696864    2876 request.go:632] Waited for 194.200989ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:03.696916    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:03.696926    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.696938    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.696944    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.700327    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.700769    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:03.700782    2876 pod_ready.go:82] duration metric: took 399.189338ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.700791    2876 pod_ready.go:39] duration metric: took 3.201699285s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:31:03.700816    2876 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:31:03.700877    2876 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:31:03.712528    2876 api_server.go:72] duration metric: took 20.039964419s to wait for apiserver process to appear ...
	I0831 15:31:03.712539    2876 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:31:03.712554    2876 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:31:03.715722    2876 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:31:03.715760    2876 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:31:03.715765    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.715771    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.715775    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.716371    2876 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:31:03.716424    2876 api_server.go:141] control plane version: v1.31.0
	I0831 15:31:03.716433    2876 api_server.go:131] duration metric: took 3.890107ms to wait for apiserver health ...
	I0831 15:31:03.716440    2876 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:31:03.898331    2876 request.go:632] Waited for 181.827666ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:03.898385    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:03.898446    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.898465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.898473    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.903436    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:31:03.906746    2876 system_pods.go:59] 17 kube-system pods found
	I0831 15:31:03.906767    2876 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:31:03.906771    2876 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:31:03.906775    2876 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:31:03.906778    2876 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:31:03.906783    2876 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:31:03.906786    2876 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:31:03.906789    2876 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:31:03.906793    2876 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:31:03.906796    2876 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:31:03.906799    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:31:03.906802    2876 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:31:03.906805    2876 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:31:03.906810    2876 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:31:03.906814    2876 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:31:03.906816    2876 system_pods.go:61] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:31:03.906819    2876 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:31:03.906824    2876 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:31:03.906830    2876 system_pods.go:74] duration metric: took 190.381994ms to wait for pod list to return data ...
	I0831 15:31:03.906835    2876 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:31:04.096833    2876 request.go:632] Waited for 189.933385ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:31:04.096919    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:31:04.096929    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.096940    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.096947    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.100750    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:04.100942    2876 default_sa.go:45] found service account: "default"
	I0831 15:31:04.100955    2876 default_sa.go:55] duration metric: took 194.103228ms for default service account to be created ...
	I0831 15:31:04.100963    2876 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:31:04.297283    2876 request.go:632] Waited for 196.269925ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:04.297349    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:04.297359    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.297370    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.297380    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.301594    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:31:04.305403    2876 system_pods.go:86] 17 kube-system pods found
	I0831 15:31:04.305414    2876 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:31:04.305418    2876 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:31:04.305421    2876 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:31:04.305424    2876 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:31:04.305427    2876 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:31:04.305431    2876 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:31:04.305434    2876 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:31:04.305438    2876 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:31:04.305440    2876 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:31:04.305443    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:31:04.305446    2876 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:31:04.305449    2876 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:31:04.305452    2876 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:31:04.305455    2876 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:31:04.305457    2876 system_pods.go:89] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:31:04.305459    2876 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:31:04.305462    2876 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:31:04.305467    2876 system_pods.go:126] duration metric: took 204.496865ms to wait for k8s-apps to be running ...
	I0831 15:31:04.305472    2876 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:31:04.305532    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:31:04.316332    2876 system_svc.go:56] duration metric: took 10.855844ms WaitForService to wait for kubelet
	I0831 15:31:04.316347    2876 kubeadm.go:582] duration metric: took 20.643776408s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:31:04.316359    2876 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:31:04.497360    2876 request.go:632] Waited for 180.939277ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:31:04.497396    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:31:04.497400    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.497406    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.497409    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.500112    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:04.500615    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:31:04.500630    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:31:04.500640    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:31:04.500644    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:31:04.500647    2876 node_conditions.go:105] duration metric: took 184.28246ms to run NodePressure ...
	I0831 15:31:04.500655    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:31:04.500673    2876 start.go:255] writing updated cluster config ...
	I0831 15:31:04.522012    2876 out.go:201] 
	I0831 15:31:04.543188    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:04.543261    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:04.565062    2876 out.go:177] * Starting "ha-949000-m03" control-plane node in "ha-949000" cluster
	I0831 15:31:04.608029    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:31:04.608097    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:31:04.608326    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:31:04.608349    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:31:04.608480    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:04.609474    2876 start.go:360] acquireMachinesLock for ha-949000-m03: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:31:04.609608    2876 start.go:364] duration metric: took 107.158µs to acquireMachinesLock for "ha-949000-m03"
	I0831 15:31:04.609644    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m03 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:04.609770    2876 start.go:125] createHost starting for "m03" (driver="hyperkit")
	I0831 15:31:04.631012    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:31:04.631142    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:04.631178    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:04.640831    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51128
	I0831 15:31:04.641212    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:04.641538    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:04.641551    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:04.641754    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:04.641864    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:04.641951    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:04.642054    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:31:04.642071    2876 client.go:168] LocalClient.Create starting
	I0831 15:31:04.642111    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:31:04.642169    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:31:04.642179    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:31:04.642217    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:31:04.642255    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:31:04.642264    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:31:04.642276    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:31:04.642281    2876 main.go:141] libmachine: (ha-949000-m03) Calling .PreCreateCheck
	I0831 15:31:04.642379    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:04.642422    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:04.652222    2876 main.go:141] libmachine: Creating machine...
	I0831 15:31:04.652235    2876 main.go:141] libmachine: (ha-949000-m03) Calling .Create
	I0831 15:31:04.652380    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:04.652531    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:04.652372    3223 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:31:04.652595    2876 main.go:141] libmachine: (ha-949000-m03) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:31:04.967913    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:04.967796    3223 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa...
	I0831 15:31:05.218214    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:05.218148    3223 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk...
	I0831 15:31:05.218234    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Writing magic tar header
	I0831 15:31:05.218243    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Writing SSH key tar header
	I0831 15:31:05.219245    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:05.219093    3223 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03 ...
	I0831 15:31:05.777334    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:05.777394    2876 main.go:141] libmachine: (ha-949000-m03) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid
	I0831 15:31:05.777478    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Using UUID 3fdefe95-7552-4d5b-8412-6ae6e5c787bb
	I0831 15:31:05.805053    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Generated MAC fa:59:9e:3b:35:6d
	I0831 15:31:05.805071    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:31:05.805106    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00011a5d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:31:05.805131    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00011a5d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:31:05.805226    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "3fdefe95-7552-4d5b-8412-6ae6e5c787bb", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:31:05.805279    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 3fdefe95-7552-4d5b-8412-6ae6e5c787bb -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:31:05.805308    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:31:05.808244    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Pid is 3227
	I0831 15:31:05.808817    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 0
	I0831 15:31:05.808830    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:05.808902    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:05.809826    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:05.809929    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:05.809949    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:05.809975    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:05.809992    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:05.810004    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:05.810013    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:05.816053    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:31:05.824689    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:31:05.825475    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:31:05.825495    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:31:05.825508    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:31:05.825518    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:31:06.214670    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:31:06.214691    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:31:06.330054    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:31:06.330074    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:31:06.330102    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:31:06.330119    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:31:06.330929    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:31:06.330943    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:31:07.810124    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 1
	I0831 15:31:07.810138    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:07.810246    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:07.811007    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:07.811057    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:07.811067    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:07.811076    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:07.811082    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:07.811088    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:07.811097    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:09.811187    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 2
	I0831 15:31:09.811200    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:09.811312    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:09.812186    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:09.812196    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:09.812205    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:09.812213    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:09.812234    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:09.812241    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:09.812249    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:11.813365    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 3
	I0831 15:31:11.813388    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:11.813446    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:11.814261    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:11.814310    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:11.814328    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:11.814337    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:11.814342    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:11.814361    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:11.814371    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:11.957428    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:31:11.957483    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:31:11.957496    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:31:11.981309    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:31:13.815231    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 4
	I0831 15:31:13.815245    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:13.815334    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:13.816118    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:13.816176    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:13.816186    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:13.816194    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:13.816200    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:13.816208    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:13.816220    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:15.816252    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 5
	I0831 15:31:15.816273    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:15.816393    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:15.817241    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:15.817305    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0831 15:31:15.817315    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:31:15.817332    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found match: fa:59:9e:3b:35:6d
	I0831 15:31:15.817339    2876 main.go:141] libmachine: (ha-949000-m03) DBG | IP: 192.169.0.7
	I0831 15:31:15.817379    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:15.817997    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:15.818096    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:15.818188    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:31:15.818195    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:31:15.818279    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:15.818331    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:15.819115    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:31:15.819122    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:31:15.819126    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:31:15.819130    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:15.819211    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:15.819288    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:15.819367    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:15.819433    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:15.819544    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:15.819737    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:15.819744    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:31:16.864414    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:31:16.864428    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:31:16.864434    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.864597    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.864686    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.864782    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.864877    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.865009    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.865163    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.865170    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:31:16.911810    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:31:16.911850    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:31:16.911857    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:31:16.911862    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:16.911989    2876 buildroot.go:166] provisioning hostname "ha-949000-m03"
	I0831 15:31:16.911998    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:16.912088    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.912161    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.912247    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.912323    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.912399    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.912532    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.912676    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.912685    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m03 && echo "ha-949000-m03" | sudo tee /etc/hostname
	I0831 15:31:16.972401    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m03
	
	I0831 15:31:16.972418    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.972554    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.972683    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.972793    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.972889    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.973016    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.973150    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.973161    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:31:17.026608    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:31:17.026626    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:31:17.026635    2876 buildroot.go:174] setting up certificates
	I0831 15:31:17.026641    2876 provision.go:84] configureAuth start
	I0831 15:31:17.026647    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:17.026793    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:17.026903    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.026995    2876 provision.go:143] copyHostCerts
	I0831 15:31:17.027029    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:31:17.027088    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:31:17.027094    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:31:17.027236    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:31:17.027433    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:31:17.027471    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:31:17.027477    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:31:17.027559    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:31:17.027700    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:31:17.027737    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:31:17.027742    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:31:17.027813    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:31:17.027956    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m03 san=[127.0.0.1 192.169.0.7 ha-949000-m03 localhost minikube]
	I0831 15:31:17.258292    2876 provision.go:177] copyRemoteCerts
	I0831 15:31:17.258340    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:31:17.258353    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.258490    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.258583    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.258663    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.258746    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:17.289869    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:31:17.289967    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:31:17.308984    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:31:17.309048    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:31:17.328947    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:31:17.329010    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:31:17.348578    2876 provision.go:87] duration metric: took 321.944434ms to configureAuth
	I0831 15:31:17.348592    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:31:17.348776    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:17.348791    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:17.348926    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.349023    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.349112    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.349190    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.349267    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.349365    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.349505    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.349513    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:31:17.396974    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:31:17.396988    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:31:17.397075    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:31:17.397087    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.397218    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.397314    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.397402    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.397507    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.397637    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.397789    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.397838    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:31:17.455821    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:31:17.455842    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.455977    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.456072    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.456168    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.456252    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.456374    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.456520    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.456533    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:31:19.032300    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:31:19.032316    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:31:19.032323    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetURL
	I0831 15:31:19.032456    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:31:19.032464    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:31:19.032468    2876 client.go:171] duration metric: took 14.391172658s to LocalClient.Create
	I0831 15:31:19.032480    2876 start.go:167] duration metric: took 14.391215349s to libmachine.API.Create "ha-949000"
	I0831 15:31:19.032489    2876 start.go:293] postStartSetup for "ha-949000-m03" (driver="hyperkit")
	I0831 15:31:19.032496    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:31:19.032506    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.032660    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:31:19.032675    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.032767    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.032855    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.032947    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.033033    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:19.073938    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:31:19.079886    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:31:19.079901    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:31:19.080017    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:31:19.080199    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:31:19.080206    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:31:19.080413    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:31:19.092434    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:31:19.119963    2876 start.go:296] duration metric: took 87.46929ms for postStartSetup
	I0831 15:31:19.119990    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:19.120591    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:19.120767    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:19.121161    2876 start.go:128] duration metric: took 14.512164484s to createHost
	I0831 15:31:19.121177    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.121269    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.121343    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.121419    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.121507    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.121631    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:19.121747    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:19.121754    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:31:19.168319    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143479.023948613
	
	I0831 15:31:19.168331    2876 fix.go:216] guest clock: 1725143479.023948613
	I0831 15:31:19.168337    2876 fix.go:229] Guest: 2024-08-31 15:31:19.023948613 -0700 PDT Remote: 2024-08-31 15:31:19.12117 -0700 PDT m=+129.881500927 (delta=-97.221387ms)
	I0831 15:31:19.168349    2876 fix.go:200] guest clock delta is within tolerance: -97.221387ms
	I0831 15:31:19.168354    2876 start.go:83] releasing machines lock for "ha-949000-m03", held for 14.559521208s
	I0831 15:31:19.168370    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.168508    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:19.193570    2876 out.go:177] * Found network options:
	I0831 15:31:19.255565    2876 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0831 15:31:19.295062    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:31:19.295088    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:31:19.295104    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.295822    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.296008    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.296101    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:31:19.296130    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	W0831 15:31:19.296153    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:31:19.296165    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:31:19.296225    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:31:19.296229    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.296236    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.296334    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.296350    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.296442    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.296455    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.296560    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:19.296581    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.296680    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	W0831 15:31:19.323572    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:31:19.323629    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:31:19.371272    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:31:19.371294    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:31:19.371393    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:31:19.387591    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:31:19.396789    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:31:19.405160    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:31:19.405208    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:31:19.413496    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:31:19.422096    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:31:19.430386    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:31:19.438699    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:31:19.447187    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:31:19.455984    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:31:19.464947    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:31:19.474438    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:31:19.482528    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:31:19.490487    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:19.582349    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
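The block above rewrites `/etc/containerd/config.toml` purely with in-place `sed` edits, then restarts containerd. As an illustrative aside (not part of the captured output), the key substitution — forcing `SystemdCgroup = false` so containerd uses the cgroupfs driver — can be reproduced locally against a hypothetical sample file:

```shell
# Sketch of the SystemdCgroup rewrite minikube runs over SSH (GNU sed,
# as inside the Linux guest). The config.toml content is hypothetical.
cat > /tmp/config.toml <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF

# Same substitution as the logged ssh_runner command: keep indentation,
# replace whatever value was there with "false".
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /tmp/config.toml

grep 'SystemdCgroup' /tmp/config.toml
```

Note that `sed -i` here is GNU syntax; on macOS (BSD sed) the equivalent would need `-i ''`, but the logged command executes inside the guest VM, not on the Darwin host.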
	I0831 15:31:19.599985    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:31:19.600056    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:31:19.612555    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:31:19.632269    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:31:19.650343    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:31:19.661102    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:31:19.671812    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:31:19.695791    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:31:19.706786    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:31:19.722246    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:31:19.725125    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:31:19.732176    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:31:19.745845    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:31:19.848832    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:31:19.960260    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:31:19.960281    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:31:19.974005    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:20.073538    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:31:22.469978    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.396488217s)
	I0831 15:31:22.470044    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:31:22.482132    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:31:22.494892    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:31:22.505113    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:31:22.597737    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:31:22.715451    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:22.823995    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:31:22.837904    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:31:22.849106    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:22.943937    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:31:23.002374    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:31:23.002452    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:31:23.006859    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:31:23.006916    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:31:23.010129    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:31:23.037227    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:31:23.037307    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:31:23.056021    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:31:23.095679    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:31:23.119303    2876 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:31:23.162269    2876 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:31:23.183203    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:23.183553    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:31:23.187788    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:31:23.197219    2876 mustload.go:65] Loading cluster: ha-949000
	I0831 15:31:23.197405    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:23.197647    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:23.197669    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:23.206705    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51151
	I0831 15:31:23.207061    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:23.207432    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:23.207448    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:23.207666    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:23.207786    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:31:23.207874    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:23.207946    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:31:23.208928    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:31:23.209186    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:23.209220    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:23.218074    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51153
	I0831 15:31:23.218433    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:23.218804    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:23.218819    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:23.219039    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:23.219165    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:31:23.219284    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.7
	I0831 15:31:23.219289    2876 certs.go:194] generating shared ca certs ...
	I0831 15:31:23.219301    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.219493    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:31:23.219569    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:31:23.219578    2876 certs.go:256] generating profile certs ...
	I0831 15:31:23.219685    2876 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:31:23.219705    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3
	I0831 15:31:23.219719    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0831 15:31:23.437317    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 ...
	I0831 15:31:23.437340    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3: {Name:mk58aa028a0f003ebc9e4d90dc317cdac139f88f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.437643    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3 ...
	I0831 15:31:23.437656    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3: {Name:mkaffb8ad3060932ca991ed93b1f8350d31a48ee Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.437859    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:31:23.438064    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:31:23.438321    2876 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:31:23.438330    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:31:23.438352    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:31:23.438370    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:31:23.438423    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:31:23.438445    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:31:23.438467    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:31:23.438484    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:31:23.438502    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:31:23.438598    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:31:23.438648    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:31:23.438657    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:31:23.438698    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:31:23.438737    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:31:23.438775    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:31:23.438861    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:31:23.438902    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.438923    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.438941    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.438970    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:31:23.439126    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:31:23.439259    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:31:23.439370    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:31:23.439494    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:31:23.472129    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:31:23.475604    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:31:23.483468    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:31:23.486771    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:31:23.494732    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:31:23.497856    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:31:23.505900    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:31:23.509221    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:31:23.517853    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:31:23.521110    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:31:23.529522    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:31:23.532921    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:31:23.540561    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:31:23.560999    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:31:23.580941    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:31:23.601890    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:31:23.621742    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1444 bytes)
	I0831 15:31:23.642294    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:31:23.662119    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:31:23.682734    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:31:23.702621    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:31:23.722704    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:31:23.743032    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:31:23.763003    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:31:23.776540    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:31:23.790112    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:31:23.803743    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:31:23.817470    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:31:23.831871    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:31:23.845310    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:31:23.858947    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:31:23.863254    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:31:23.871668    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.875114    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.875147    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.879499    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:31:23.888263    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:31:23.896800    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.900783    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.900840    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.905239    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:31:23.913677    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:31:23.921998    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.925382    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.925421    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.929547    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
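The `openssl x509 -hash` / `ln -fs ... <hash>.0` pairs above follow the standard OpenSSL hashed-CA-directory convention: each symlink in `/etc/ssl/certs` is named after the certificate's subject-name hash with a `.0` suffix, which is how `b5213941.0` maps back to `minikubeCA.pem`. A sketch of the same convention with a throwaway self-signed certificate (all names here are illustrative):

```shell
# Reproduce the hash-symlink convention used for /etc/ssl/certs above,
# in a scratch directory with a throwaway self-signed certificate.
cd "$(mktemp -d)"
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout demo.key -out demoCA.pem -days 1 2>/dev/null

# Same subject-name hash OpenSSL uses to look up CAs in a hashed dir.
h=$(openssl x509 -hash -noout -in demoCA.pem)
ln -fs demoCA.pem "$h.0"
ls -l "$h.0"
```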
	I0831 15:31:23.938211    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:31:23.941244    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:31:23.941280    2876 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.31.0 docker true true} ...
	I0831 15:31:23.941346    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:31:23.941365    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:31:23.941403    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:31:23.953552    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:31:23.953594    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:31:23.953640    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:31:23.961797    2876 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:31:23.961850    2876 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:31:23.970244    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256
	I0831 15:31:23.970245    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256
	I0831 15:31:23.970248    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256
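The three `binary.go:74` lines show the checksum-pinned download pattern: each binary URL carries a `checksum=file:<url>.sha256` companion, so the fetched bytes are verified against a published SHA-256 digest before use. The verification half can be sketched locally with `sha256sum -c`, using a hypothetical stand-in file rather than a real download:

```shell
# Sketch of sha256 verification as applied to the kubeadm/kubectl/kubelet
# downloads above. The payload is a hypothetical stand-in, not a real binary.
printf 'fake-kubelet-binary\n' > /tmp/kubelet.demo
sha256sum /tmp/kubelet.demo | awk '{print $1}' > /tmp/kubelet.demo.sha256

# sha256sum -c expects "<hash>  <file>" lines; pair the bare digest
# (as published at dl.k8s.io) with the local path ourselves.
echo "$(cat /tmp/kubelet.demo.sha256)  /tmp/kubelet.demo" | sha256sum -c -
```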
	I0831 15:31:23.970260    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:31:23.970262    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:31:23.970297    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:31:23.970351    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:31:23.970358    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:31:23.982898    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:31:23.982926    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:31:23.982950    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:31:23.982949    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:31:23.982968    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:31:23.983039    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:31:24.006648    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:31:24.006684    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
	I0831 15:31:24.520609    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:31:24.528302    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:31:24.542845    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:31:24.556549    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:31:24.581157    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:31:24.584179    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
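The `/etc/hosts` command above uses an idempotent grep-and-rewrite pattern: strip any existing `control-plane.minikube.internal` line, append the fresh VIP mapping, and copy the temp file back, so repeated runs never accumulate duplicates. A sketch against a scratch file instead of the guest's `/etc/hosts` (paths and IPs here are illustrative):

```shell
# Idempotent hosts-entry rewrite, mirroring the logged ssh_runner command
# but against a scratch file (all paths/IPs are illustrative).
HOSTS=/tmp/hosts.demo
TAB=$(printf '\t')
printf '127.0.0.1\tlocalhost\n192.169.0.9\tcontrol-plane.minikube.internal\n' > "$HOSTS"

# Drop any stale mapping, append the current VIP, then copy back.
{ grep -v "${TAB}control-plane.minikube.internal\$" "$HOSTS"
  printf '192.169.0.254\tcontrol-plane.minikube.internal\n'; } > /tmp/h.$$
cp /tmp/h.$$ "$HOSTS"
```

After this runs, the stale `192.169.0.9` mapping is gone and exactly one `control-plane.minikube.internal` entry remains.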
	I0831 15:31:24.593696    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:24.689916    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:31:24.707403    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:31:24.707700    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:24.707728    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:24.717047    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51156
	I0831 15:31:24.717380    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:24.717728    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:24.717743    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:24.718003    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:24.718123    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:31:24.718213    2876 start.go:317] joinCluster: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:31:24.718336    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0831 15:31:24.718349    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:31:24.718430    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:31:24.718495    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:31:24.718573    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:31:24.718638    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:31:24.810129    2876 start.go:343] trying to join control-plane node "m03" to cluster: &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:24.810181    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token l0ka7f.9kdk1py3wyogvy9t --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443"
	I0831 15:31:52.526613    2876 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token l0ka7f.9kdk1py3wyogvy9t --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443": (27.716564604s)
	I0831 15:31:52.526639    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0831 15:31:53.011028    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000-m03 minikube.k8s.io/updated_at=2024_08_31T15_31_53_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=false
	I0831 15:31:53.087862    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-949000-m03 node-role.kubernetes.io/control-plane:NoSchedule-
	I0831 15:31:53.172826    2876 start.go:319] duration metric: took 28.454760565s to joinCluster
	I0831 15:31:53.172884    2876 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:53.173075    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:53.197446    2876 out.go:177] * Verifying Kubernetes components...
	I0831 15:31:53.254031    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:53.535623    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:31:53.558317    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:31:53.558557    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:31:53.558593    2876 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:31:53.558836    2876 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:31:53.558893    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:53.558899    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:53.558906    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:53.558909    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:53.561151    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:54.058994    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:54.059009    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:54.059015    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:54.059020    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:54.061381    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:54.559376    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:54.559389    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:54.559396    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:54.559399    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:54.561772    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:55.059628    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:55.059676    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:55.059690    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:55.059700    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:55.063078    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:55.559418    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:55.559433    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:55.559439    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:55.559442    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:55.561338    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:55.561664    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:31:56.059758    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:56.059770    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:56.059776    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:56.059780    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:56.061794    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:56.560083    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:56.560095    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:56.560101    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:56.560105    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:56.562114    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.058995    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:57.059011    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:57.059017    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:57.059021    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:57.060963    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.560137    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:57.560149    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:57.560155    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:57.560159    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:57.561978    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.562328    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:31:58.059061    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:58.059074    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:58.059080    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:58.059086    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:58.061472    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:58.559244    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:58.559270    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:58.559282    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:58.559289    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:58.562722    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:59.060308    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:59.060330    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:59.060342    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:59.060359    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:59.063517    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:59.560099    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:59.560116    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:59.560125    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:59.560129    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:59.562184    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:59.562628    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:00.059591    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:00.059615    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:00.059662    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:00.059677    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:00.063389    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:00.560430    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:00.560444    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:00.560451    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:00.560455    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:00.562483    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:01.059473    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:01.059498    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:01.059509    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:01.059514    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:01.062773    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:01.559271    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:01.559298    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:01.559310    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:01.559317    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:01.562641    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:01.563242    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:02.060140    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:02.060168    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:02.060211    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:02.060244    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:02.063601    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:02.559282    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:02.559308    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:02.559320    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:02.559329    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:02.562623    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:03.059890    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:03.059911    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:03.059923    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:03.059930    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:03.063409    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:03.559394    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:03.559453    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:03.559465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:03.559470    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:03.562567    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:04.060698    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:04.060714    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:04.060719    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:04.060727    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:04.062955    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:04.063278    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:04.560096    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:04.560118    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:04.560165    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:04.560173    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:04.562791    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:05.060622    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:05.060648    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:05.060659    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:05.060665    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:05.064011    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:05.559954    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:05.559976    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:05.559988    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:05.559994    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:05.563422    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:06.059812    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:06.059870    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:06.059880    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:06.059886    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:06.062529    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:06.560071    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:06.560096    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:06.560107    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:06.560113    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:06.563538    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:06.564037    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:07.059298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:07.059324    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:07.059335    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:07.059342    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:07.063048    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:07.559252    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:07.559277    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:07.559291    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:07.559297    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:07.562373    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:08.061149    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:08.061210    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:08.061223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:08.061234    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:08.064402    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:08.559428    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:08.559452    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:08.559463    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:08.559468    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:08.562526    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:09.060827    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:09.060878    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:09.060891    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:09.060900    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:09.063954    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:09.064537    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:09.561212    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:09.561237    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:09.561283    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:09.561292    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:09.564677    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:10.060675    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:10.060694    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:10.060714    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:10.060718    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:10.062779    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:10.560397    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:10.560424    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:10.560435    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:10.560441    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:10.564079    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.060679    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:11.060705    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:11.060716    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:11.060722    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:11.064114    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.559466    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:11.559492    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:11.559503    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:11.559567    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:11.562752    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.563402    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:12.059348    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:12.059373    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:12.059384    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:12.059389    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:12.062810    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:12.561048    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:12.561106    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:12.561120    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:12.561141    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:12.564459    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.059831    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.059855    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.059867    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.059873    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.063079    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.063582    2876 node_ready.go:49] node "ha-949000-m03" has status "Ready":"True"
	I0831 15:32:13.063594    2876 node_ready.go:38] duration metric: took 19.504599366s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:32:13.063602    2876 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:32:13.063657    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:13.063665    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.063674    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.063682    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.067458    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.072324    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.072373    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:32:13.072379    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.072385    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.072389    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.074327    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.074802    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.074810    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.074815    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.074820    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.076654    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.076987    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.076996    2876 pod_ready.go:82] duration metric: took 4.661444ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.077003    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.077041    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:32:13.077046    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.077052    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.077056    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.078862    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.079264    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.079271    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.079277    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.079280    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.081027    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.081326    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.081335    2876 pod_ready.go:82] duration metric: took 4.326858ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.081342    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.081372    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:32:13.081379    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.081385    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.081388    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.083263    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.083632    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.083639    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.083645    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.083649    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.085181    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.085480    2876 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.085490    2876 pod_ready.go:82] duration metric: took 4.142531ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.085497    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.085526    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:32:13.085531    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.085537    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.085541    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.087128    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.087501    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:13.087508    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.087513    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.087518    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.088959    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.089244    2876 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.089252    2876 pod_ready.go:82] duration metric: took 3.751049ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.089258    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.261887    2876 request.go:632] Waited for 172.592535ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:32:13.261972    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:32:13.261978    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.262019    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.262028    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.264296    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:13.460589    2876 request.go:632] Waited for 195.842812ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.460724    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.460735    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.460745    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.460759    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.463962    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.464378    2876 pod_ready.go:93] pod "etcd-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.464391    2876 pod_ready.go:82] duration metric: took 375.12348ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.464404    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.661862    2876 request.go:632] Waited for 197.406518ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:32:13.661977    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:32:13.661988    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.661999    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.662005    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.665393    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.861181    2876 request.go:632] Waited for 195.385788ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.861214    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.861218    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.861225    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.861260    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.863261    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.863567    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.863577    2876 pod_ready.go:82] duration metric: took 399.161484ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.863584    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.061861    2876 request.go:632] Waited for 198.232413ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:32:14.061952    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:32:14.061961    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.061972    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.061979    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.064530    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:14.260004    2876 request.go:632] Waited for 194.98208ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:14.260143    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:14.260166    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.260182    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.260227    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.266580    2876 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:32:14.266908    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:14.266927    2876 pod_ready.go:82] duration metric: took 403.325368ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.266937    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.460025    2876 request.go:632] Waited for 193.045445ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:32:14.460093    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:32:14.460101    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.460110    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.460117    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.462588    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:14.660940    2876 request.go:632] Waited for 197.721547ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:14.661070    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:14.661080    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.661096    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.661109    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.664541    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:14.664954    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:14.664967    2876 pod_ready.go:82] duration metric: took 398.020825ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.664979    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.861147    2876 request.go:632] Waited for 196.115866ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:32:14.861203    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:32:14.861211    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.861223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.861231    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.864847    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.060912    2876 request.go:632] Waited for 195.310518ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:15.060968    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:15.060983    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.061000    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.061011    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.064271    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.064583    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.064594    2876 pod_ready.go:82] duration metric: took 399.604845ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.064603    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.260515    2876 request.go:632] Waited for 195.841074ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:32:15.260662    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:32:15.260676    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.260688    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.260702    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.264411    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.461372    2876 request.go:632] Waited for 196.432681ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:15.461470    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:15.461484    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.461502    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.461513    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.464382    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:15.464683    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.464691    2876 pod_ready.go:82] duration metric: took 400.078711ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.464700    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.660288    2876 request.go:632] Waited for 195.551444ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:32:15.660318    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:32:15.660323    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.660357    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.660363    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.663247    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:15.860473    2876 request.go:632] Waited for 196.823661ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:15.860532    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:15.860542    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.860556    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.860563    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.863954    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.864333    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.864346    2876 pod_ready.go:82] duration metric: took 399.636293ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.864355    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.060306    2876 request.go:632] Waited for 195.900703ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:32:16.060410    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:32:16.060437    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.060449    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.060455    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.063745    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.260402    2876 request.go:632] Waited for 195.997957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:16.260523    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:16.260539    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.260551    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.260563    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.264052    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.264373    2876 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:16.264385    2876 pod_ready.go:82] duration metric: took 400.01997ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.264394    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.461128    2876 request.go:632] Waited for 196.682855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:32:16.461251    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:32:16.461264    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.461275    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.461282    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.464602    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.660248    2876 request.go:632] Waited for 195.08291ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:16.660298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:16.660310    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.660327    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.660340    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.663471    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.664017    2876 pod_ready.go:93] pod "kube-proxy-d45q5" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:16.664029    2876 pod_ready.go:82] duration metric: took 399.623986ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.664038    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.859948    2876 request.go:632] Waited for 195.845325ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:32:16.860034    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:32:16.860060    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.860083    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.860094    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.863263    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.060250    2876 request.go:632] Waited for 196.410574ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.060307    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.060319    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.060334    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.060345    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.063664    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.064113    2876 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.064125    2876 pod_ready.go:82] duration metric: took 400.076522ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.064134    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.260150    2876 request.go:632] Waited for 195.935266ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:32:17.260232    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:32:17.260246    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.260305    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.260324    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.263756    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.460703    2876 request.go:632] Waited for 196.426241ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.460753    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.460765    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.460776    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.460799    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.463925    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.464439    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.464449    2876 pod_ready.go:82] duration metric: took 400.306164ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.464463    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.660506    2876 request.go:632] Waited for 196.00354ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:32:17.660541    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:32:17.660547    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.660553    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.660568    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.662504    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:17.859973    2876 request.go:632] Waited for 197.106962ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:17.860023    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:17.860031    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.860084    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.860092    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.869330    2876 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0831 15:32:17.869629    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.869638    2876 pod_ready.go:82] duration metric: took 405.16449ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.869646    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:18.060370    2876 request.go:632] Waited for 190.671952ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:32:18.060479    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:32:18.060492    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.060504    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.060511    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.063196    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.260902    2876 request.go:632] Waited for 197.387182ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:18.260947    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:18.260955    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.260976    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.261000    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.263780    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.264154    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:18.264163    2876 pod_ready.go:82] duration metric: took 394.508983ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:18.264171    2876 pod_ready.go:39] duration metric: took 5.200505122s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:32:18.264182    2876 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:32:18.264235    2876 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:32:18.276016    2876 api_server.go:72] duration metric: took 25.102905505s to wait for apiserver process to appear ...
	I0831 15:32:18.276029    2876 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:32:18.276040    2876 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:32:18.280474    2876 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:32:18.280519    2876 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:32:18.280525    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.280531    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.280535    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.281148    2876 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:32:18.281176    2876 api_server.go:141] control plane version: v1.31.0
	I0831 15:32:18.281184    2876 api_server.go:131] duration metric: took 5.150155ms to wait for apiserver health ...
	I0831 15:32:18.281189    2876 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:32:18.460471    2876 request.go:632] Waited for 179.236076ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.460573    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.460585    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.460596    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.460604    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.465317    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:32:18.469906    2876 system_pods.go:59] 24 kube-system pods found
	I0831 15:32:18.469918    2876 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:32:18.469921    2876 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:32:18.469925    2876 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:32:18.469928    2876 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:32:18.469933    2876 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:32:18.469937    2876 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:32:18.469939    2876 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:32:18.469943    2876 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:32:18.469946    2876 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:32:18.469949    2876 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:32:18.469954    2876 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:32:18.469958    2876 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:32:18.469961    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:32:18.469963    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:32:18.469966    2876 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:32:18.469969    2876 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:32:18.469972    2876 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:32:18.469975    2876 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:32:18.469978    2876 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:32:18.469980    2876 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:32:18.469983    2876 system_pods.go:61] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:32:18.469985    2876 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:32:18.469988    2876 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:32:18.469990    2876 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:32:18.469994    2876 system_pods.go:74] duration metric: took 188.799972ms to wait for pod list to return data ...
	I0831 15:32:18.470000    2876 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:32:18.659945    2876 request.go:632] Waited for 189.894855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:32:18.659986    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:32:18.660002    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.660011    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.660017    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.662843    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.662901    2876 default_sa.go:45] found service account: "default"
	I0831 15:32:18.662910    2876 default_sa.go:55] duration metric: took 192.903479ms for default service account to be created ...
	I0831 15:32:18.662915    2876 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:32:18.860267    2876 request.go:632] Waited for 197.296928ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.860299    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.860304    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.860310    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.860316    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.864052    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:18.868873    2876 system_pods.go:86] 24 kube-system pods found
	I0831 15:32:18.868886    2876 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:32:18.868891    2876 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:32:18.868894    2876 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:32:18.868897    2876 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:32:18.868901    2876 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:32:18.868904    2876 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:32:18.868907    2876 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:32:18.868912    2876 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:32:18.868916    2876 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:32:18.868918    2876 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:32:18.868922    2876 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:32:18.868927    2876 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:32:18.868931    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:32:18.868934    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:32:18.868938    2876 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:32:18.868941    2876 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:32:18.868944    2876 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:32:18.868947    2876 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:32:18.868950    2876 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:32:18.868953    2876 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:32:18.868957    2876 system_pods.go:89] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:32:18.868959    2876 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:32:18.868963    2876 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:32:18.868966    2876 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:32:18.868971    2876 system_pods.go:126] duration metric: took 206.049826ms to wait for k8s-apps to be running ...
	I0831 15:32:18.868980    2876 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:32:18.869030    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:32:18.880958    2876 system_svc.go:56] duration metric: took 11.976044ms WaitForService to wait for kubelet
	I0831 15:32:18.880978    2876 kubeadm.go:582] duration metric: took 25.707859659s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:32:18.880990    2876 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:32:19.060320    2876 request.go:632] Waited for 179.26426ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:32:19.060365    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:32:19.060371    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:19.060379    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:19.060385    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:19.063168    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:19.063767    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063776    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063782    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063785    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063789    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063791    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063794    2876 node_conditions.go:105] duration metric: took 182.798166ms to run NodePressure ...
	I0831 15:32:19.063802    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:32:19.063817    2876 start.go:255] writing updated cluster config ...
	I0831 15:32:19.064186    2876 ssh_runner.go:195] Run: rm -f paused
	I0831 15:32:19.107477    2876 start.go:600] kubectl: 1.29.2, cluster: 1.31.0 (minor skew: 2)
	I0831 15:32:19.128559    2876 out.go:201] 
	W0831 15:32:19.149451    2876 out.go:270] ! /usr/local/bin/kubectl is version 1.29.2, which may have incompatibilities with Kubernetes 1.31.0.
	I0831 15:32:19.170407    2876 out.go:177]   - Want kubectl v1.31.0? Try 'minikube kubectl -- get pods -A'
	I0831 15:32:19.212551    2876 out.go:177] * Done! kubectl is now configured to use "ha-949000" cluster and "default" namespace by default
	
	
	==> Docker <==
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/7da75377db13c80b27b99ccc9f52561a4408675361947cf393e0c38286a71997/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.201910840Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202112013Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202132705Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202328611Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/1017bd5eac1d26de2df318c0dc0ac8d5db92d72e8c268401502a145b3ad0d9d8/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/271da20951c9ab4102e979dc2b97b3a9c8d992db5fc7ebac3f954ea9edee9d48/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.346950244Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347136993Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347223771Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347348772Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379063396Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379210402Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379226413Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379336044Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.320619490Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.320945499Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.321018153Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.321131565Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:32:21Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/f68483c946835415bfdf0531bfc6be41dd321162f4c19af555ece0f66ee7cabe/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Aug 31 22:32:22 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:32:22Z" level=info msg="Stop pulling image gcr.io/k8s-minikube/busybox:1.28: Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:1.28"
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716842379Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716906766Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716920530Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.721236974Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	2f925f16b74b0       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   About a minute ago   Running             busybox                   0                   f68483c946835       busybox-7dff88458-5kkbw
	b1db836cd7a3d       cbb01a7bd410d                                                                                         3 minutes ago        Running             coredns                   0                   271da20951c9a       coredns-6f6b679f8f-kjszm
	def4d6bd20bc5       cbb01a7bd410d                                                                                         3 minutes ago        Running             coredns                   0                   1017bd5eac1d2       coredns-6f6b679f8f-snq8s
	22fbb8a8e01ad       6e38f40d628db                                                                                         3 minutes ago        Running             storage-provisioner       0                   7da75377db13c       storage-provisioner
	6d156ce626115       kindest/kindnetd@sha256:e59a687ca28ae274a2fc92f1e2f5f1c739f353178a43a23aafc71adb802ed166              3 minutes ago        Running             kindnet-cni               0                   7d1851c17485c       kindnet-jzj42
	54d5f8041c89d       ad83b2ca7b09e                                                                                         3 minutes ago        Running             kube-proxy                0                   4b0198ac7dc52       kube-proxy-q7ndn
	c99fe831b20c1       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     4 minutes ago        Running             kube-vip                  0                   9ef7e0fa361d5       kube-vip-ha-949000
	c734c23a53082       2e96e5913fc06                                                                                         4 minutes ago        Running             etcd                      0                   7cfaf9f5d4dd4       etcd-ha-949000
	02c10e4f765d1       1766f54c897f0                                                                                         4 minutes ago        Running             kube-scheduler            0                   c084f2a259f6c       kube-scheduler-ha-949000
	6670fd34164cb       045733566833c                                                                                         4 minutes ago        Running             kube-controller-manager   0                   f9573e28f9d4d       kube-controller-manager-ha-949000
	ffec6106be6c8       604f5db92eaa8                                                                                         4 minutes ago        Running             kube-apiserver            0                   25c49852f78dc       kube-apiserver-ha-949000
	
	
	==> coredns [b1db836cd7a3] <==
	[INFO] 10.244.1.2:56414 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000107837s
	[INFO] 10.244.1.2:53184 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000079726s
	[INFO] 10.244.1.2:58757 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000418868s
	[INFO] 10.244.1.2:39299 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000067106s
	[INFO] 10.244.2.2:56948 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000080585s
	[INFO] 10.244.2.2:56973 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000078985s
	[INFO] 10.244.2.2:43081 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000100123s
	[INFO] 10.244.2.2:56390 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000040214s
	[INFO] 10.244.2.2:52519 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000061255s
	[INFO] 10.244.0.4:36226 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000151133s
	[INFO] 10.244.1.2:44017 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089111s
	[INFO] 10.244.1.2:37224 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000069144s
	[INFO] 10.244.1.2:51282 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000118723s
	[INFO] 10.244.2.2:35009 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089507s
	[INFO] 10.244.2.2:60607 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000049176s
	[INFO] 10.244.2.2:36851 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000097758s
	[INFO] 10.244.0.4:59717 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000053986s
	[INFO] 10.244.0.4:58447 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000060419s
	[INFO] 10.244.1.2:60381 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000136898s
	[INFO] 10.244.1.2:32783 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.00010303s
	[INFO] 10.244.1.2:44904 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000042493s
	[INFO] 10.244.1.2:44085 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000132084s
	[INFO] 10.244.2.2:43635 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000080947s
	[INFO] 10.244.2.2:40020 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000081919s
	[INFO] 10.244.2.2:53730 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000058015s
	
	
	==> coredns [def4d6bd20bc] <==
	[INFO] 10.244.0.4:41865 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.008744161s
	[INFO] 10.244.1.2:50080 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000093199s
	[INFO] 10.244.1.2:55576 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.000574417s
	[INFO] 10.244.1.2:36293 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000065455s
	[INFO] 10.244.2.2:41223 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000063892s
	[INFO] 10.244.0.4:54135 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000096141s
	[INFO] 10.244.0.4:39176 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000742646s
	[INFO] 10.244.0.4:58445 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000080113s
	[INFO] 10.244.0.4:56242 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000066269s
	[INFO] 10.244.0.4:60657 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049645s
	[INFO] 10.244.1.2:48306 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000561931s
	[INFO] 10.244.1.2:40767 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000077826s
	[INFO] 10.244.1.2:35669 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000056994s
	[INFO] 10.244.1.2:57720 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000040565s
	[INFO] 10.244.2.2:38794 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000136901s
	[INFO] 10.244.2.2:33576 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000052374s
	[INFO] 10.244.2.2:57053 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000051289s
	[INFO] 10.244.0.4:47623 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000056903s
	[INFO] 10.244.0.4:59818 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00003011s
	[INFO] 10.244.0.4:53586 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000029565s
	[INFO] 10.244.1.2:60045 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000060878s
	[INFO] 10.244.2.2:38400 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000078624s
	[INFO] 10.244.0.4:58765 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000075707s
	[INFO] 10.244.0.4:32804 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000050785s
	[INFO] 10.244.2.2:48459 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.00007773s
	
	
	==> describe nodes <==
	Name:               ha-949000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:29:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:33:38 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:30:07 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-949000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 e8535f0b09e14aea8b2456a9d977fc80
	  System UUID:                98ca49d1-0000-0000-9e6c-321a4533d56e
	  Boot ID:                    4896b77b-e0f4-43c0-af0e-3998b4352bec
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-5kkbw              0 (0%)        0 (0%)      0 (0%)           0 (0%)         85s
	  kube-system                 coredns-6f6b679f8f-kjszm             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     3m56s
	  kube-system                 coredns-6f6b679f8f-snq8s             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     3m56s
	  kube-system                 etcd-ha-949000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         4m1s
	  kube-system                 kindnet-jzj42                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      3m57s
	  kube-system                 kube-apiserver-ha-949000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         4m2s
	  kube-system                 kube-controller-manager-ha-949000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         4m1s
	  kube-system                 kube-proxy-q7ndn                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m57s
	  kube-system                 kube-scheduler-ha-949000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         4m3s
	  kube-system                 kube-vip-ha-949000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m3s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m57s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age    From             Message
	  ----    ------                   ----   ----             -------
	  Normal  Starting                 3m55s  kube-proxy       
	  Normal  Starting                 4m1s   kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  4m1s   kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  4m1s   kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    4m1s   kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     4m1s   kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           3m57s  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeReady                3m38s  kubelet          Node ha-949000 status is now: NodeReady
	  Normal  RegisteredNode           2m57s  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           107s   node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	
	
	Name:               ha-949000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:30:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:33:44 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:31:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-949000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 31d5d81c627e4d65bfa15e4c54f7f7c1
	  System UUID:                23e54f3d-0000-0000-86b7-b25c818528d1
	  Boot ID:                    021c5fd3-b441-490e-ac27-d927c00459f2
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-6r9s5                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         85s
	  kube-system                 etcd-ha-949000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         3m2s
	  kube-system                 kindnet-brtj6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      3m4s
	  kube-system                 kube-apiserver-ha-949000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         3m2s
	  kube-system                 kube-controller-manager-ha-949000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         2m59s
	  kube-system                 kube-proxy-4r2bt                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m4s
	  kube-system                 kube-scheduler-ha-949000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         2m58s
	  kube-system                 kube-vip-ha-949000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 3m                   kube-proxy       
	  Normal  NodeHasSufficientMemory  3m4s (x8 over 3m4s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    3m4s (x8 over 3m4s)  kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     3m4s (x7 over 3m4s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  3m4s                 kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           3m2s                 node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal  RegisteredNode           2m57s                node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal  RegisteredNode           107s                 node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	
	
	Name:               ha-949000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_31_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:31:50 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:33:42 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:32:13 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-949000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 0aea5b50957a40edad0152e71b7f3a2a
	  System UUID:                3fde4d5b-0000-0000-8412-6ae6e5c787bb
	  Boot ID:                    2d4c31ca-c268-4eb4-ad45-716d78aaaa5c
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-vjf9x                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         85s
	  kube-system                 etcd-ha-949000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         112s
	  kube-system                 kindnet-9j85v                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      115s
	  kube-system                 kube-apiserver-ha-949000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         112s
	  kube-system                 kube-controller-manager-ha-949000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         114s
	  kube-system                 kube-proxy-d45q5                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         115s
	  kube-system                 kube-scheduler-ha-949000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         114s
	  kube-system                 kube-vip-ha-949000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         111s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 111s                 kube-proxy       
	  Normal  NodeHasSufficientMemory  115s (x8 over 115s)  kubelet          Node ha-949000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    115s (x8 over 115s)  kubelet          Node ha-949000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     115s (x7 over 115s)  kubelet          Node ha-949000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  115s                 kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           112s                 node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal  RegisteredNode           112s                 node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal  RegisteredNode           107s                 node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	
	
	==> dmesg <==
	[  +2.774485] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.237441] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000003] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.596627] systemd-fstab-generator[494]: Ignoring "noauto" option for root device
	[  +0.090743] systemd-fstab-generator[506]: Ignoring "noauto" option for root device
	[  +1.756564] systemd-fstab-generator[845]: Ignoring "noauto" option for root device
	[  +0.273405] systemd-fstab-generator[883]: Ignoring "noauto" option for root device
	[  +0.102089] systemd-fstab-generator[895]: Ignoring "noauto" option for root device
	[  +0.058959] kauditd_printk_skb: 115 callbacks suppressed
	[  +0.059797] systemd-fstab-generator[909]: Ignoring "noauto" option for root device
	[  +2.526421] systemd-fstab-generator[1125]: Ignoring "noauto" option for root device
	[  +0.100331] systemd-fstab-generator[1137]: Ignoring "noauto" option for root device
	[  +0.099114] systemd-fstab-generator[1149]: Ignoring "noauto" option for root device
	[  +0.141519] systemd-fstab-generator[1164]: Ignoring "noauto" option for root device
	[  +3.497423] systemd-fstab-generator[1265]: Ignoring "noauto" option for root device
	[  +0.066902] kauditd_printk_skb: 158 callbacks suppressed
	[  +2.572406] systemd-fstab-generator[1521]: Ignoring "noauto" option for root device
	[  +3.569896] systemd-fstab-generator[1651]: Ignoring "noauto" option for root device
	[  +0.054418] kauditd_printk_skb: 70 callbacks suppressed
	[  +7.004094] systemd-fstab-generator[2150]: Ignoring "noauto" option for root device
	[  +0.086539] kauditd_printk_skb: 72 callbacks suppressed
	[  +5.400345] kauditd_printk_skb: 12 callbacks suppressed
	[  +5.311598] kauditd_printk_skb: 29 callbacks suppressed
	[Aug31 22:30] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [c734c23a5308] <==
	{"level":"info","ts":"2024-08-31T22:30:42.586467Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:30:43.071231Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(3559962241544385584 13314548521573537860)"}
	{"level":"info","ts":"2024-08-31T22:30:43.071481Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-08-31T22:30:43.071678Z","caller":"etcdserver/server.go:1996","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:31:50.552948Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(3559962241544385584 13314548521573537860) learners=(485493211181035330)"}
	{"level":"info","ts":"2024-08-31T22:31:50.553563Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844","added-peer-id":"6bcd180d94f2f42","added-peer-peer-urls":["https://192.169.0.7:2380"]}
	{"level":"info","ts":"2024-08-31T22:31:50.553811Z","caller":"rafthttp/peer.go:133","msg":"starting remote peer","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.553888Z","caller":"rafthttp/pipeline.go:72","msg":"started HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.563089Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.563597Z","caller":"rafthttp/peer.go:137","msg":"started remote peer","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.563782Z","caller":"rafthttp/transport.go:317","msg":"added remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-urls":["https://192.169.0.7:2380"]}
	{"level":"info","ts":"2024-08-31T22:31:50.563934Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.564027Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.564274Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"warn","ts":"2024-08-31T22:31:51.592382Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"6bcd180d94f2f42","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-08-31T22:31:51.796182Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:51.801097Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:51.801930Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:51.814490Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"6bcd180d94f2f42","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-08-31T22:31:51.814527Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:51.822457Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"6bcd180d94f2f42","stream-type":"stream Message"}
	{"level":"info","ts":"2024-08-31T22:31:51.822549Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:52.588081Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(485493211181035330 3559962241544385584 13314548521573537860)"}
	{"level":"info","ts":"2024-08-31T22:31:52.588433Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-08-31T22:31:52.588653Z","caller":"etcdserver/server.go:1996","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"6bcd180d94f2f42"}
	
	
	==> kernel <==
	 22:33:45 up 4 min,  0 users,  load average: 0.24, 0.18, 0.09
	Linux ha-949000 5.10.207 #1 SMP Wed Aug 28 20:54:17 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [6d156ce62611] <==
	I0831 22:33:05.622835       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:15.619953       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:15.620161       1 main.go:299] handling current node
	I0831 22:33:15.620244       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:15.620403       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:15.620694       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:15.620783       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:25.614304       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:25.614589       1 main.go:299] handling current node
	I0831 22:33:25.614804       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:25.615060       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:25.615515       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:25.615641       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:35.620070       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:35.620108       1 main.go:299] handling current node
	I0831 22:33:35.620119       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:35.620124       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:35.620269       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:35.620297       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:45.620982       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:45.621246       1 main.go:299] handling current node
	I0831 22:33:45.621372       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:45.621475       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:45.621703       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:45.621934       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [ffec6106be6c] <==
	I0831 22:29:42.351464       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0831 22:29:42.447047       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0831 22:29:42.450860       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	I0831 22:29:42.451599       1 controller.go:615] quota admission added evaluator for: endpoints
	I0831 22:29:42.454145       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0831 22:29:43.117776       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0831 22:29:44.628868       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0831 22:29:44.643482       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0831 22:29:44.649286       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0831 22:29:48.568363       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	I0831 22:29:48.768446       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	E0831 22:32:24.583976       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51190: use of closed network connection
	E0831 22:32:24.787019       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51192: use of closed network connection
	E0831 22:32:24.994355       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51194: use of closed network connection
	E0831 22:32:25.183977       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51196: use of closed network connection
	E0831 22:32:25.381277       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51198: use of closed network connection
	E0831 22:32:25.569952       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51200: use of closed network connection
	E0831 22:32:25.763008       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51202: use of closed network connection
	E0831 22:32:25.965367       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51204: use of closed network connection
	E0831 22:32:26.154701       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51206: use of closed network connection
	E0831 22:32:26.694309       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51211: use of closed network connection
	E0831 22:32:26.880399       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51213: use of closed network connection
	E0831 22:32:27.077320       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51215: use of closed network connection
	E0831 22:32:27.267610       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51217: use of closed network connection
	E0831 22:32:27.476005       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51219: use of closed network connection
	
	
	==> kube-controller-manager [6670fd34164c] <==
	I0831 22:31:58.309145       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:31:58.363553       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:00.655864       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:13.090917       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:13.100697       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:13.164123       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:20.074086       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="91.437594ms"
	I0831 22:32:20.089117       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="14.696904ms"
	I0831 22:32:20.155832       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="66.417676ms"
	I0831 22:32:20.247938       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="91.617712ms"
	E0831 22:32:20.248480       1 replica_set.go:560] "Unhandled Error" err="sync \"default/busybox-7dff88458\" failed with Operation cannot be fulfilled on replicasets.apps \"busybox-7dff88458\": the object has been modified; please apply your changes to the latest version and try again" logger="UnhandledError"
	I0831 22:32:20.257744       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="7.890782ms"
	I0831 22:32:20.258053       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="29.491µs"
	I0831 22:32:20.352807       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="29.639µs"
	I0831 22:32:21.164054       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:21.310383       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="34.795µs"
	I0831 22:32:22.115926       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="5.066721ms"
	I0831 22:32:22.116004       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="26.449µs"
	I0831 22:32:23.502335       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="6.289855ms"
	I0831 22:32:23.502432       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="58.061µs"
	I0831 22:32:24.043757       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="4.626106ms"
	I0831 22:32:24.044703       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="46.785µs"
	I0831 22:32:44.005602       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m02"
	I0831 22:32:48.178405       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000"
	I0831 22:32:52.115444       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	
	
	==> kube-proxy [54d5f8041c89] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:29:49.977338       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:29:49.983071       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:29:49.983430       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:29:50.023032       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:29:50.023054       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:29:50.023070       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:29:50.025790       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:29:50.026014       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:29:50.026061       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:29:50.026844       1 config.go:197] "Starting service config controller"
	I0831 22:29:50.027602       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:29:50.027141       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:29:50.027698       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:29:50.027260       1 config.go:326] "Starting node config controller"
	I0831 22:29:50.027720       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:29:50.128122       1 shared_informer.go:320] Caches are synced for node config
	I0831 22:29:50.128144       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:29:50.128162       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-scheduler [02c10e4f765d] <==
	W0831 22:29:42.107023       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0831 22:29:42.107231       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.111966       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0831 22:29:42.112045       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.116498       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0831 22:29:42.116539       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.129701       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0831 22:29:42.129741       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError"
	I0831 22:29:45.342252       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0831 22:31:50.464567       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.464652       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod 9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2(kube-system/kube-proxy-d45q5) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-d45q5"
	E0831 22:31:50.464667       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" pod="kube-system/kube-proxy-d45q5"
	I0831 22:31:50.464683       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.476710       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:31:50.476756       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod c551bb18-9a7d-4fca-9724-be7900980a40(kube-system/kindnet-l4zbh) wasn't assumed so cannot be forgotten" pod="kube-system/kindnet-l4zbh"
	E0831 22:31:50.476767       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" pod="kube-system/kindnet-l4zbh"
	I0831 22:31:50.476781       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:32:20.049491       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-6r9s5" node="ha-949000-m02"
	E0831 22:32:20.049618       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" pod="default/busybox-7dff88458-6r9s5"
	E0831 22:32:20.071235       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-vjf9x" node="ha-949000-m03"
	E0831 22:32:20.071466       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" pod="default/busybox-7dff88458-vjf9x"
	E0831 22:32:20.073498       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	E0831 22:32:20.073571       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod e97e21d8-a69e-451c-babd-6232e12aafe0(default/busybox-7dff88458-5kkbw) wasn't assumed so cannot be forgotten" pod="default/busybox-7dff88458-5kkbw"
	E0831 22:32:20.077323       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" pod="default/busybox-7dff88458-5kkbw"
	I0831 22:32:20.077394       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	
	
	==> kubelet <==
	Aug 31 22:30:08 ha-949000 kubelet[2157]: I0831 22:30:08.742452    2157 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-snq8s" podStartSLOduration=19.742440453 podStartE2EDuration="19.742440453s" podCreationTimestamp="2024-08-31 22:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-31 22:30:08.742201936 +0000 UTC m=+24.362226027" watchObservedRunningTime="2024-08-31 22:30:08.742440453 +0000 UTC m=+24.362464538"
	Aug 31 22:30:08 ha-949000 kubelet[2157]: I0831 22:30:08.742651    2157 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/storage-provisioner" podStartSLOduration=20.742642621999998 podStartE2EDuration="20.742642622s" podCreationTimestamp="2024-08-31 22:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-31 22:30:08.732189424 +0000 UTC m=+24.352213514" watchObservedRunningTime="2024-08-31 22:30:08.742642622 +0000 UTC m=+24.362666707"
	Aug 31 22:30:44 ha-949000 kubelet[2157]: E0831 22:30:44.495173    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:30:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:30:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:30:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:30:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:31:44 ha-949000 kubelet[2157]: E0831 22:31:44.490275    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:31:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:32:20 ha-949000 kubelet[2157]: W0831 22:32:20.081132    2157 reflector.go:561] object-"default"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ha-949000" cannot list resource "configmaps" in API group "" in the namespace "default": no relationship found between node 'ha-949000' and this object
	Aug 31 22:32:20 ha-949000 kubelet[2157]: E0831 22:32:20.081252    2157 reflector.go:158] "Unhandled Error" err="object-\"default\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ha-949000\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"default\": no relationship found between node 'ha-949000' and this object" logger="UnhandledError"
	Aug 31 22:32:20 ha-949000 kubelet[2157]: I0831 22:32:20.223174    2157 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l95k\" (UniqueName: \"kubernetes.io/projected/e97e21d8-a69e-451c-babd-6232e12aafe0-kube-api-access-6l95k\") pod \"busybox-7dff88458-5kkbw\" (UID: \"e97e21d8-a69e-451c-babd-6232e12aafe0\") " pod="default/busybox-7dff88458-5kkbw"
	Aug 31 22:32:44 ha-949000 kubelet[2157]: E0831 22:32:44.489812    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:32:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:33:44 ha-949000 kubelet[2157]: E0831 22:33:44.492393    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:33:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-949000 -n ha-949000
helpers_test.go:262: (dbg) Run:  kubectl --context ha-949000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:286: <<< TestMultiControlPlane/serial/AddWorkerNode FAILED: end of post-mortem logs <<<
helpers_test.go:287: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/AddWorkerNode (79.65s)

                                                
                                    
TestMultiControlPlane/serial/CopyFile (3.42s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/CopyFile
ha_test.go:326: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status --output json -v=7 --alsologtostderr
ha_test.go:326: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status --output json -v=7 --alsologtostderr: exit status 2 (450.600023ms)

                                                
                                                
-- stdout --
	[{"Name":"ha-949000","Host":"Running","Kubelet":"Running","APIServer":"Running","Kubeconfig":"Configured","Worker":false},{"Name":"ha-949000-m02","Host":"Running","Kubelet":"Running","APIServer":"Running","Kubeconfig":"Configured","Worker":false},{"Name":"ha-949000-m03","Host":"Running","Kubelet":"Running","APIServer":"Running","Kubeconfig":"Configured","Worker":false},{"Name":"ha-949000-m04","Host":"Running","Kubelet":"Stopped","APIServer":"Irrelevant","Kubeconfig":"Irrelevant","Worker":true}]

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:33:47.447118    3447 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:33:47.447899    3447 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:33:47.447907    3447 out.go:358] Setting ErrFile to fd 2...
	I0831 15:33:47.447914    3447 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:33:47.448485    3447 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:33:47.448679    3447 out.go:352] Setting JSON to true
	I0831 15:33:47.448702    3447 mustload.go:65] Loading cluster: ha-949000
	I0831 15:33:47.448730    3447 notify.go:220] Checking for updates...
	I0831 15:33:47.448995    3447 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:33:47.449011    3447 status.go:255] checking status of ha-949000 ...
	I0831 15:33:47.449342    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.449390    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.458333    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51302
	I0831 15:33:47.458666    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.459063    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.459072    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.459289    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.459399    3447 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:33:47.459479    3447 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:33:47.459547    3447 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:33:47.460558    3447 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:33:47.460585    3447 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:33:47.460855    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.460899    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.469219    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51304
	I0831 15:33:47.469557    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.469924    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.469941    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.470169    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.470277    3447 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:33:47.470355    3447 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:33:47.470599    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.470621    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.479169    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51306
	I0831 15:33:47.479468    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.479824    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.479841    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.480026    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.480133    3447 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:33:47.480274    3447 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:33:47.480293    3447 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:33:47.480373    3447 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:33:47.480452    3447 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:33:47.480528    3447 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:33:47.480602    3447 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:33:47.518115    3447 ssh_runner.go:195] Run: systemctl --version
	I0831 15:33:47.522437    3447 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:33:47.533656    3447 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:33:47.533684    3447 api_server.go:166] Checking apiserver status ...
	I0831 15:33:47.533733    3447 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:33:47.546857    3447 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:33:47.554859    3447 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:33:47.554905    3447 ssh_runner.go:195] Run: ls
	I0831 15:33:47.558030    3447 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:33:47.561014    3447 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:33:47.561026    3447 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:33:47.561035    3447 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:33:47.561046    3447 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:33:47.561303    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.561332    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.569921    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51310
	I0831 15:33:47.570266    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.570606    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.570614    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.570839    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.570948    3447 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:33:47.571029    3447 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:33:47.571108    3447 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:33:47.572101    3447 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:33:47.572109    3447 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:33:47.572350    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.572372    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.581281    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51312
	I0831 15:33:47.581615    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.581976    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.581992    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.582231    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.582355    3447 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:33:47.582453    3447 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:33:47.582719    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.582756    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.591528    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51314
	I0831 15:33:47.591873    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.592186    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.592194    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.592403    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.592515    3447 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:33:47.592652    3447 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:33:47.592664    3447 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:33:47.592739    3447 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:33:47.592835    3447 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:33:47.592919    3447 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:33:47.593025    3447 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:33:47.629976    3447 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:33:47.643578    3447 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:33:47.643598    3447 api_server.go:166] Checking apiserver status ...
	I0831 15:33:47.643646    3447 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:33:47.655176    3447 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1935/cgroup
	W0831 15:33:47.662332    3447 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1935/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:33:47.662375    3447 ssh_runner.go:195] Run: ls
	I0831 15:33:47.665949    3447 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:33:47.668977    3447 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:33:47.668989    3447 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:33:47.668997    3447 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:33:47.669006    3447 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:33:47.669267    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.669287    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.678003    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51318
	I0831 15:33:47.678341    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.678669    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.678679    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.678881    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.678987    3447 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:33:47.679073    3447 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:33:47.679160    3447 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:33:47.680166    3447 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:33:47.680173    3447 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:33:47.680426    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.680456    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.689185    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51320
	I0831 15:33:47.689528    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.689836    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.689847    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.690074    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.690181    3447 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:33:47.690276    3447 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:33:47.690581    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.690608    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.699212    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51322
	I0831 15:33:47.699544    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.699887    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.699902    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.700131    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.700240    3447 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:33:47.700362    3447 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:33:47.700374    3447 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:33:47.700453    3447 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:33:47.700532    3447 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:33:47.700627    3447 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:33:47.700713    3447 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:33:47.728452    3447 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:33:47.738724    3447 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:33:47.738738    3447 api_server.go:166] Checking apiserver status ...
	I0831 15:33:47.738776    3447 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:33:47.749848    3447 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:33:47.756899    3447 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:33:47.756938    3447 ssh_runner.go:195] Run: ls
	I0831 15:33:47.759913    3447 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:33:47.762986    3447 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:33:47.762997    3447 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:33:47.763005    3447 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:33:47.763015    3447 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:33:47.763260    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.763280    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.772036    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51326
	I0831 15:33:47.772381    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.772704    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.772717    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.772950    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.773065    3447 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:33:47.773153    3447 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:33:47.773232    3447 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:33:47.774216    3447 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:33:47.774226    3447 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:33:47.774485    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.774509    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.782984    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51328
	I0831 15:33:47.783331    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.783699    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.783718    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.783940    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.784053    3447 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:33:47.784135    3447 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:33:47.784408    3447 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:47.784432    3447 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:47.792967    3447 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51330
	I0831 15:33:47.793304    3447 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:47.793676    3447 main.go:141] libmachine: Using API Version  1
	I0831 15:33:47.793693    3447 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:47.793923    3447 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:47.794037    3447 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:33:47.794181    3447 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:33:47.794193    3447 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:33:47.794277    3447 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:33:47.794359    3447 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:33:47.794446    3447 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:33:47.794523    3447 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:33:47.830186    3447 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:33:47.841706    3447 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:328: failed to run minikube status. args "out/minikube-darwin-amd64 -p ha-949000 status --output json -v=7 --alsologtostderr" : exit status 2
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-949000 -n ha-949000
helpers_test.go:245: <<< TestMultiControlPlane/serial/CopyFile FAILED: start of post-mortem logs <<<
helpers_test.go:246: ======>  post-mortem[TestMultiControlPlane/serial/CopyFile]: minikube logs <======
helpers_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 logs -n 25
helpers_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 logs -n 25: (2.360813207s)
helpers_test.go:253: TestMultiControlPlane/serial/CopyFile logs: 
-- stdout --
	
	==> Audit <==
	|----------------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	|    Command     |                 Args                 |      Profile      |  User   | Version |     Start Time      |      End Time       |
	|----------------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| update-context | functional-593000                    | functional-593000 | jenkins | v1.33.1 | 31 Aug 24 15:28 PDT | 31 Aug 24 15:28 PDT |
	|                | update-context                       |                   |         |         |                     |                     |
	|                | --alsologtostderr -v=2               |                   |         |         |                     |                     |
	| update-context | functional-593000                    | functional-593000 | jenkins | v1.33.1 | 31 Aug 24 15:28 PDT | 31 Aug 24 15:28 PDT |
	|                | update-context                       |                   |         |         |                     |                     |
	|                | --alsologtostderr -v=2               |                   |         |         |                     |                     |
	| delete         | -p functional-593000                 | functional-593000 | jenkins | v1.33.1 | 31 Aug 24 15:29 PDT | 31 Aug 24 15:29 PDT |
	| start          | -p ha-949000 --wait=true             | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:29 PDT | 31 Aug 24 15:32 PDT |
	|                | --memory=2200 --ha                   |                   |         |         |                     |                     |
	|                | -v=7 --alsologtostderr               |                   |         |         |                     |                     |
	|                | --driver=hyperkit                    |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- apply -f             | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | ./testdata/ha/ha-pod-dns-test.yaml   |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- rollout status       | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | deployment/busybox                   |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw -- nslookup  |                   |         |         |                     |                     |
	|                | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 -- nslookup  |                   |         |         |                     |                     |
	|                | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x -- nslookup  |                   |         |         |                     |                     |
	|                | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw              |                   |         |         |                     |                     |
	|                | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|                | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|                | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw -- sh        |                   |         |         |                     |                     |
	|                | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5              |                   |         |         |                     |                     |
	|                | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|                | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|                | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 -- sh        |                   |         |         |                     |                     |
	|                | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x              |                   |         |         |                     |                     |
	|                | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|                | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|                | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x -- sh        |                   |         |         |                     |                     |
	|                | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| node           | add -p ha-949000 -v=7                | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT |                     |
	|                | --alsologtostderr                    |                   |         |         |                     |                     |
	|----------------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
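The `sh -c nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3` invocations in the audit table above extract the host IP from busybox's nslookup output. A minimal sketch of that pipeline, assuming busybox-style `Address 1:` output (the sample text below is illustrative, not captured from this run):

```shell
# Line 5 of the sample nslookup output holds the answer record ("Address 1: <ip>");
# awk selects that line, and cut takes the third space-separated field: the IP itself.
printf 'Server:    10.96.0.10\nAddress 1: 10.96.0.10\n\nName:      host.minikube.internal\nAddress 1: 192.169.0.1\n' \
  | awk 'NR==5' | cut -d' ' -f3
```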
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:29:09
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:29:09.276641    2876 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:29:09.276909    2876 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:29:09.276915    2876 out.go:358] Setting ErrFile to fd 2...
	I0831 15:29:09.276919    2876 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:29:09.277077    2876 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:29:09.278657    2876 out.go:352] Setting JSON to false
	I0831 15:29:09.304076    2876 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1720,"bootTime":1725141629,"procs":442,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:29:09.304206    2876 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:29:09.363205    2876 out.go:177] * [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:29:09.404287    2876 notify.go:220] Checking for updates...
	I0831 15:29:09.428120    2876 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:29:09.489040    2876 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:09.566857    2876 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:29:09.611464    2876 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:29:09.632356    2876 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:09.653358    2876 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:29:09.674652    2876 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:29:09.704277    2876 out.go:177] * Using the hyperkit driver based on user configuration
	I0831 15:29:09.746520    2876 start.go:297] selected driver: hyperkit
	I0831 15:29:09.746549    2876 start.go:901] validating driver "hyperkit" against <nil>
	I0831 15:29:09.746572    2876 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:29:09.750947    2876 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:29:09.751059    2876 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:29:09.759462    2876 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:29:09.763334    2876 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:09.763355    2876 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:29:09.763386    2876 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 15:29:09.763603    2876 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:29:09.763661    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:09.763670    2876 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0831 15:29:09.763676    2876 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0831 15:29:09.763757    2876 start.go:340] cluster config:
	{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:29:09.763847    2876 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:29:09.806188    2876 out.go:177] * Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	I0831 15:29:09.827330    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:09.827400    2876 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:29:09.827429    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:29:09.827640    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:29:09.827663    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:29:09.828200    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:09.828242    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json: {Name:mka3af2c42dba1cbf0f487cd55ddf735793024ce Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:09.828849    2876 start.go:360] acquireMachinesLock for ha-949000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:29:09.828952    2876 start.go:364] duration metric: took 84.577µs to acquireMachinesLock for "ha-949000"
	I0831 15:29:09.828988    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:09.829059    2876 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 15:29:09.903354    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:29:09.903628    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:09.903698    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:09.913643    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51029
	I0831 15:29:09.913991    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:09.914387    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:09.914395    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:09.914636    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:09.914768    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:09.914873    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:09.915000    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:29:09.915023    2876 client.go:168] LocalClient.Create starting
	I0831 15:29:09.915061    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:29:09.915112    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:09.915129    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:09.915188    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:29:09.915229    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:09.915249    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:09.915265    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:29:09.915270    2876 main.go:141] libmachine: (ha-949000) Calling .PreCreateCheck
	I0831 15:29:09.915359    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:09.915528    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:09.915949    2876 main.go:141] libmachine: Creating machine...
	I0831 15:29:09.915958    2876 main.go:141] libmachine: (ha-949000) Calling .Create
	I0831 15:29:09.916028    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:09.916144    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:09.916024    2884 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:09.916224    2876 main.go:141] libmachine: (ha-949000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:29:10.099863    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.099790    2884 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa...
	I0831 15:29:10.256390    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.256317    2884 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk...
	I0831 15:29:10.256437    2876 main.go:141] libmachine: (ha-949000) DBG | Writing magic tar header
	I0831 15:29:10.256445    2876 main.go:141] libmachine: (ha-949000) DBG | Writing SSH key tar header
	I0831 15:29:10.257253    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.257126    2884 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000 ...
	I0831 15:29:10.614937    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:10.614967    2876 main.go:141] libmachine: (ha-949000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid
	I0831 15:29:10.615070    2876 main.go:141] libmachine: (ha-949000) DBG | Using UUID 98cab9ba-901d-49d1-9e6c-321a4533d56e
	I0831 15:29:10.724629    2876 main.go:141] libmachine: (ha-949000) DBG | Generated MAC ce:8:77:f7:42:5e
	I0831 15:29:10.724653    2876 main.go:141] libmachine: (ha-949000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:29:10.724744    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ae630)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:10.724785    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ae630)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:10.724823    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "98cab9ba-901d-49d1-9e6c-321a4533d56e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:29:10.724851    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 98cab9ba-901d-49d1-9e6c-321a4533d56e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:29:10.724862    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:29:10.727687    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Pid is 2887
	I0831 15:29:10.728136    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 0
	I0831 15:29:10.728145    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:10.728201    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:10.729180    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:10.729276    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:10.729293    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:10.729309    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:10.729317    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:10.735289    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:29:10.788351    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:29:10.788955    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:10.788972    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:10.788980    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:10.788989    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:11.164652    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:29:11.164668    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:29:11.279214    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:11.279233    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:11.279245    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:11.279263    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:11.280165    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:29:11.280176    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:29:12.729552    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 1
	I0831 15:29:12.729568    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:12.729694    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:12.730495    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:12.730552    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:12.730566    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:12.730580    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:12.730595    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:14.731472    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 2
	I0831 15:29:14.731486    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:14.731548    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:14.732412    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:14.732458    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:14.732473    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:14.732492    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:14.732506    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:16.732786    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 3
	I0831 15:29:16.732802    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:16.732855    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:16.733685    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:16.733713    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:16.733721    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:16.733748    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:16.733759    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:16.839902    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:29:16.839946    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:29:16.839959    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:29:16.864989    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:29:18.735154    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 4
	I0831 15:29:18.735170    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:18.735286    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:18.736038    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:18.736084    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:18.736094    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:18.736103    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:18.736112    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:20.736683    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 5
	I0831 15:29:20.736698    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:20.736791    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:20.737588    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:20.737620    2876 main.go:141] libmachine: (ha-949000) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:20.737633    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:20.737640    2876 main.go:141] libmachine: (ha-949000) DBG | Found match: ce:8:77:f7:42:5e
	I0831 15:29:20.737645    2876 main.go:141] libmachine: (ha-949000) DBG | IP: 192.169.0.5
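The retry loop above ("Attempt 1" through "Attempt 5") resolves the new VM's MAC address to its DHCP-assigned IP by rescanning `/var/db/dhcpd_leases` until an entry with the expected hardware address appears. A minimal sketch of that matching logic, operating on entries in the same shape the log prints them (the helper name and entry layout here are illustrative, not minikube's actual parser). Note that hyperkit reports MAC octets without zero-padding (`ce:8:…`, not `ce:08:…`), so the search string must use that exact form:

```go
package main

import (
	"fmt"
	"strings"
)

// findLeaseIP scans dhcpd-lease-style entries for a hardware address and
// returns the matching IP, mirroring the driver's "Searching for <MAC>" loop.
func findLeaseIP(leases, hwAddr string) (string, bool) {
	for _, entry := range strings.Split(leases, "}") {
		// Match the full HWAddress field; a trailing space avoids prefix collisions.
		if !strings.Contains(entry, "HWAddress:"+hwAddr+" ") {
			continue
		}
		for _, field := range strings.Fields(entry) {
			if ip, ok := strings.CutPrefix(field, "IPAddress:"); ok {
				return ip, true
			}
		}
	}
	return "", false
}

func main() {
	// Simplified entries, in the same shape the log prints them.
	leases := "{Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}" +
		"{Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}"
	if ip, ok := findLeaseIP(leases, "ce:8:77:f7:42:5e"); ok {
		fmt.Println("Found match:", ip)
	}
}
```

The driver only "Found 3 entries" on the first four attempts because the guest had not yet requested a lease; once the fourth lease shows up, the match succeeds and provisioning proceeds.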
	I0831 15:29:20.737694    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:20.738300    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:20.738400    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:20.738493    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:29:20.738503    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:20.738582    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:20.738639    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:20.739400    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:29:20.739409    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:29:20.739415    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:29:20.739420    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:20.739500    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:20.739608    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:20.739694    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:20.739784    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:20.739906    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:20.740082    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:20.740088    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:29:21.810169    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:29:21.810183    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:29:21.810190    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.810319    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.810409    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.810520    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.810622    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.810753    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.810899    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.810907    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:29:21.876064    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:29:21.876103    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:29:21.876110    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:29:21.876116    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:21.876252    2876 buildroot.go:166] provisioning hostname "ha-949000"
	I0831 15:29:21.876263    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:21.876353    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.876438    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.876542    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.876625    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.876705    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.876835    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.876977    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.876986    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000 && echo "ha-949000" | sudo tee /etc/hostname
	I0831 15:29:21.955731    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000
	
	I0831 15:29:21.955752    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.955889    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.955998    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.956098    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.956196    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.956332    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.956482    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.956494    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:29:22.031652    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:29:22.031674    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:29:22.031695    2876 buildroot.go:174] setting up certificates
	I0831 15:29:22.031704    2876 provision.go:84] configureAuth start
	I0831 15:29:22.031711    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:22.031840    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:22.031922    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.032006    2876 provision.go:143] copyHostCerts
	I0831 15:29:22.032046    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:29:22.032109    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:29:22.032118    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:29:22.032257    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:29:22.032465    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:29:22.032502    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:29:22.032507    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:29:22.032592    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:29:22.032752    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:29:22.032790    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:29:22.032795    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:29:22.032874    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:29:22.033015    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000 san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]
	I0831 15:29:22.113278    2876 provision.go:177] copyRemoteCerts
	I0831 15:29:22.113334    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:29:22.113349    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.113477    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.113572    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.113653    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.113746    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:22.153055    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:29:22.153132    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:29:22.173186    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:29:22.173254    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0831 15:29:22.192526    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:29:22.192581    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:29:22.212150    2876 provision.go:87] duration metric: took 180.428736ms to configureAuth
	I0831 15:29:22.212163    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:29:22.212301    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:22.212314    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:22.212441    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.212522    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.212600    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.212680    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.212760    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.212882    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.213008    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.213015    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:29:22.281023    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:29:22.281035    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:29:22.281108    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:29:22.281121    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.281265    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.281355    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.281474    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.281559    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.281695    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.281836    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.281881    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:29:22.358523    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:29:22.358550    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.358687    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.358785    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.358873    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.358967    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.359137    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.359281    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.359293    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:29:23.900860    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:29:23.900883    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:29:23.900890    2876 main.go:141] libmachine: (ha-949000) Calling .GetURL
	I0831 15:29:23.901027    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:29:23.901035    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:29:23.901040    2876 client.go:171] duration metric: took 13.985813631s to LocalClient.Create
	I0831 15:29:23.901051    2876 start.go:167] duration metric: took 13.985855387s to libmachine.API.Create "ha-949000"
	I0831 15:29:23.901061    2876 start.go:293] postStartSetup for "ha-949000" (driver="hyperkit")
	I0831 15:29:23.901070    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:29:23.901080    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:23.901239    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:29:23.901251    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:23.901337    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:23.901438    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.901525    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:23.901622    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:23.947237    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:29:23.951946    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:29:23.951965    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:29:23.952069    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:29:23.952248    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:29:23.952255    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:29:23.952462    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:29:23.961814    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:29:23.990864    2876 start.go:296] duration metric: took 89.791408ms for postStartSetup
	I0831 15:29:23.990895    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:23.991499    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:23.991642    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:23.991961    2876 start.go:128] duration metric: took 14.162686523s to createHost
	I0831 15:29:23.991974    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:23.992084    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:23.992175    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.992259    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.992348    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:23.992457    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:23.992584    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:23.992591    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:29:24.059500    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143363.867477750
	
	I0831 15:29:24.059512    2876 fix.go:216] guest clock: 1725143363.867477750
	I0831 15:29:24.059517    2876 fix.go:229] Guest: 2024-08-31 15:29:23.86747775 -0700 PDT Remote: 2024-08-31 15:29:23.991969 -0700 PDT m=+14.752935961 (delta=-124.49125ms)
	I0831 15:29:24.059536    2876 fix.go:200] guest clock delta is within tolerance: -124.49125ms
	I0831 15:29:24.059546    2876 start.go:83] releasing machines lock for "ha-949000", held for 14.230377343s
	I0831 15:29:24.059565    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.059706    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:24.059819    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060132    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060244    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060319    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:29:24.060346    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:24.060384    2876 ssh_runner.go:195] Run: cat /version.json
	I0831 15:29:24.060396    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:24.060439    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:24.060498    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:24.060525    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:24.060623    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:24.060654    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:24.060746    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:24.060765    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:24.060837    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:24.096035    2876 ssh_runner.go:195] Run: systemctl --version
	I0831 15:29:24.148302    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:29:24.153275    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:29:24.153315    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:29:24.165840    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:29:24.165854    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:29:24.165972    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:29:24.181258    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:29:24.191149    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:29:24.200150    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:29:24.200197    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:29:24.209198    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:29:24.217930    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:29:24.227002    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:29:24.237048    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:29:24.246383    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:29:24.255322    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:29:24.264369    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:29:24.273487    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:29:24.282138    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:29:24.290220    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:24.385700    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:29:24.407032    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:29:24.407111    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:29:24.421439    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:29:24.437414    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:29:24.451401    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:29:24.463382    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:29:24.474406    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:29:24.507277    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:29:24.517707    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:29:24.532548    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:29:24.535464    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:29:24.542699    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:29:24.557395    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:29:24.662440    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:29:24.769422    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:29:24.769500    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:29:24.784888    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:24.881202    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:29:27.276172    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.394917578s)
	I0831 15:29:27.276233    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:29:27.287739    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:29:27.301676    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:29:27.312754    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:29:27.407771    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:29:27.503429    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:27.614933    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:29:27.628621    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:29:27.641141    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:27.759998    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:29:27.816359    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:29:27.816437    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:29:27.820881    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:29:27.820929    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:29:27.824109    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:29:27.852863    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:29:27.852937    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:29:27.870865    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:29:27.937728    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:29:27.937791    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:27.938219    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:29:27.943196    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:29:27.954353    2876 kubeadm.go:883] updating cluster {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:29:27.954419    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:27.954480    2876 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:29:27.967028    2876 docker.go:685] Got preloaded images: 
	I0831 15:29:27.967040    2876 docker.go:691] registry.k8s.io/kube-apiserver:v1.31.0 wasn't preloaded
	I0831 15:29:27.967094    2876 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0831 15:29:27.975409    2876 ssh_runner.go:195] Run: which lz4
	I0831 15:29:27.978323    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0831 15:29:27.978434    2876 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0831 15:29:27.981530    2876 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0831 15:29:27.981546    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (342554258 bytes)
	I0831 15:29:28.829399    2876 docker.go:649] duration metric: took 850.988233ms to copy over tarball
	I0831 15:29:28.829466    2876 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0831 15:29:31.094292    2876 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.264775779s)
	I0831 15:29:31.094306    2876 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0831 15:29:31.120523    2876 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0831 15:29:31.129444    2876 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2631 bytes)
	I0831 15:29:31.144462    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:31.255144    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:29:33.625508    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.370311255s)
	I0831 15:29:33.625595    2876 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:29:33.642024    2876 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0831 15:29:33.642043    2876 cache_images.go:84] Images are preloaded, skipping loading
	I0831 15:29:33.642059    2876 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.31.0 docker true true} ...
	I0831 15:29:33.642140    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:29:33.642205    2876 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:29:33.687213    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:33.687227    2876 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0831 15:29:33.687238    2876 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:29:33.687253    2876 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-949000 NodeName:ha-949000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:29:33.687355    2876 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-949000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 15:29:33.687380    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:29:33.687436    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:29:33.701609    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:29:33.701679    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:29:33.701731    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:29:33.709907    2876 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:29:33.709972    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0831 15:29:33.717287    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0831 15:29:33.730443    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:29:33.743765    2876 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0831 15:29:33.758082    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1446 bytes)
	I0831 15:29:33.771561    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:29:33.774412    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:29:33.783869    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:33.875944    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:29:33.891425    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.5
	I0831 15:29:33.891438    2876 certs.go:194] generating shared ca certs ...
	I0831 15:29:33.891448    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:33.891633    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:29:33.891710    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:29:33.891723    2876 certs.go:256] generating profile certs ...
	I0831 15:29:33.891775    2876 certs.go:363] generating signed profile cert for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:29:33.891786    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt with IP's: []
	I0831 15:29:34.044423    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt ...
	I0831 15:29:34.044439    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt: {Name:mkff87193f625d157d1a4f89b0da256c90604083 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.044784    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key ...
	I0831 15:29:34.044793    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key: {Name:mke1833d9b208b07a8ff6dd57d320eb167de83a3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.045031    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93
	I0831 15:29:34.045046    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.254]
	I0831 15:29:34.207099    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 ...
	I0831 15:29:34.207118    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93: {Name:mk38f2742462440beada92d4e254471d0fe85db9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.207433    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93 ...
	I0831 15:29:34.207443    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93: {Name:mk29a130e2c97d3f060f247819d7c01c723a8502 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.207661    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:29:34.207842    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:29:34.208036    2876 certs.go:363] generating signed profile cert for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:29:34.208050    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt with IP's: []
	I0831 15:29:34.314095    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt ...
	I0831 15:29:34.314111    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt: {Name:mk708e4939e774d52c9a7d3335e0202d13493538 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.314481    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key ...
	I0831 15:29:34.314489    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key: {Name:mkcfbb0611781f7e5640984b0a9cc91976dc5482 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.314700    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:29:34.314732    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:29:34.314751    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:29:34.314769    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:29:34.314787    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:29:34.314811    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:29:34.314831    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:29:34.314850    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:29:34.314947    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:29:34.314997    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:29:34.315005    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:29:34.315034    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:29:34.315062    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:29:34.315091    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:29:34.315155    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:29:34.315187    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.315211    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.315229    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.315668    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:29:34.335288    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:29:34.355233    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:29:34.374357    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:29:34.393538    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0831 15:29:34.413840    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:29:34.433106    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:29:34.452816    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:29:34.472204    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:29:34.492102    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:29:34.512126    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:29:34.530945    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:29:34.546877    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:29:34.551681    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:29:34.565047    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.568688    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.568737    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.573250    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:29:34.587250    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:29:34.595871    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.599208    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.599248    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.603521    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:29:34.611689    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:29:34.620193    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.624378    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.624428    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.628785    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:29:34.637154    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:29:34.640263    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:29:34.640305    2876 kubeadm.go:392] StartCluster: {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 C
lusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountTy
pe:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:29:34.640393    2876 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:29:34.652254    2876 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:29:34.660013    2876 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0831 15:29:34.668312    2876 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0831 15:29:34.675860    2876 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0831 15:29:34.675868    2876 kubeadm.go:157] found existing configuration files:
	
	I0831 15:29:34.675907    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0831 15:29:34.683169    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0831 15:29:34.683212    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0831 15:29:34.690543    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0831 15:29:34.697493    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0831 15:29:34.697539    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0831 15:29:34.704850    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0831 15:29:34.712593    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0831 15:29:34.712643    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0831 15:29:34.720047    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0831 15:29:34.727239    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0831 15:29:34.727279    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0831 15:29:34.734575    2876 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0831 15:29:34.806234    2876 kubeadm.go:310] [init] Using Kubernetes version: v1.31.0
	I0831 15:29:34.806318    2876 kubeadm.go:310] [preflight] Running pre-flight checks
	I0831 15:29:34.880330    2876 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0831 15:29:34.880424    2876 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0831 15:29:34.880492    2876 kubeadm.go:310] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I0831 15:29:34.888288    2876 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0831 15:29:34.931799    2876 out.go:235]   - Generating certificates and keys ...
	I0831 15:29:34.931855    2876 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0831 15:29:34.931917    2876 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0831 15:29:35.094247    2876 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0831 15:29:35.242021    2876 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0831 15:29:35.553368    2876 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0831 15:29:35.874778    2876 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0831 15:29:36.045823    2876 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0831 15:29:36.046072    2876 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-949000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0831 15:29:36.253528    2876 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0831 15:29:36.253651    2876 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-949000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0831 15:29:36.362185    2876 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0831 15:29:36.481613    2876 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0831 15:29:36.595099    2876 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0831 15:29:36.595231    2876 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0831 15:29:36.687364    2876 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0831 15:29:36.786350    2876 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0831 15:29:36.838505    2876 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0831 15:29:37.183406    2876 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0831 15:29:37.330529    2876 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0831 15:29:37.331123    2876 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0831 15:29:37.332869    2876 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0831 15:29:37.354639    2876 out.go:235]   - Booting up control plane ...
	I0831 15:29:37.354715    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0831 15:29:37.354798    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0831 15:29:37.354856    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0831 15:29:37.354940    2876 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0831 15:29:37.355015    2876 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0831 15:29:37.355046    2876 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0831 15:29:37.462381    2876 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0831 15:29:37.462478    2876 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I0831 15:29:37.972217    2876 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 510.286911ms
	I0831 15:29:37.972306    2876 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0831 15:29:43.988604    2876 kubeadm.go:310] [api-check] The API server is healthy after 6.020603512s
	I0831 15:29:44.000520    2876 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0831 15:29:44.008573    2876 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0831 15:29:44.022134    2876 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0831 15:29:44.022318    2876 kubeadm.go:310] [mark-control-plane] Marking the node ha-949000 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0831 15:29:44.029102    2876 kubeadm.go:310] [bootstrap-token] Using token: zw6kb9.o9r4potygin4i7x2
	I0831 15:29:44.050780    2876 out.go:235]   - Configuring RBAC rules ...
	I0831 15:29:44.050942    2876 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0831 15:29:44.094287    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0831 15:29:44.099052    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0831 15:29:44.101377    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0831 15:29:44.103328    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0831 15:29:44.105426    2876 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0831 15:29:44.395210    2876 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0831 15:29:44.821705    2876 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0831 15:29:45.395130    2876 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0831 15:29:45.396108    2876 kubeadm.go:310] 
	I0831 15:29:45.396158    2876 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0831 15:29:45.396163    2876 kubeadm.go:310] 
	I0831 15:29:45.396236    2876 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0831 15:29:45.396245    2876 kubeadm.go:310] 
	I0831 15:29:45.396264    2876 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0831 15:29:45.396314    2876 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0831 15:29:45.396355    2876 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0831 15:29:45.396359    2876 kubeadm.go:310] 
	I0831 15:29:45.396397    2876 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0831 15:29:45.396406    2876 kubeadm.go:310] 
	I0831 15:29:45.396453    2876 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0831 15:29:45.396458    2876 kubeadm.go:310] 
	I0831 15:29:45.396496    2876 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0831 15:29:45.396560    2876 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0831 15:29:45.396617    2876 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0831 15:29:45.396623    2876 kubeadm.go:310] 
	I0831 15:29:45.396691    2876 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0831 15:29:45.396760    2876 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0831 15:29:45.396766    2876 kubeadm.go:310] 
	I0831 15:29:45.396839    2876 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token zw6kb9.o9r4potygin4i7x2 \
	I0831 15:29:45.396919    2876 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 \
	I0831 15:29:45.396939    2876 kubeadm.go:310] 	--control-plane 
	I0831 15:29:45.396943    2876 kubeadm.go:310] 
	I0831 15:29:45.397018    2876 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0831 15:29:45.397029    2876 kubeadm.go:310] 
	I0831 15:29:45.397093    2876 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token zw6kb9.o9r4potygin4i7x2 \
	I0831 15:29:45.397173    2876 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 
	I0831 15:29:45.397526    2876 kubeadm.go:310] W0831 22:29:34.618825    1608 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 15:29:45.397751    2876 kubeadm.go:310] W0831 22:29:34.619993    1608 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 15:29:45.397847    2876 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0831 15:29:45.397857    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:45.397874    2876 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0831 15:29:45.420531    2876 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0831 15:29:45.477445    2876 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0831 15:29:45.482633    2876 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.31.0/kubectl ...
	I0831 15:29:45.482643    2876 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2601 bytes)
	I0831 15:29:45.498168    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0831 15:29:45.749965    2876 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0831 15:29:45.750050    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000 minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=true
	I0831 15:29:45.750061    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:45.882304    2876 ops.go:34] apiserver oom_adj: -16
	I0831 15:29:45.896818    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:46.398021    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:46.897815    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:47.397274    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:47.897049    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:48.397593    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:48.462357    2876 kubeadm.go:1113] duration metric: took 2.712335704s to wait for elevateKubeSystemPrivileges
	I0831 15:29:48.462374    2876 kubeadm.go:394] duration metric: took 13.821875392s to StartCluster
	I0831 15:29:48.462389    2876 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:48.462482    2876 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:48.462909    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:48.463157    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0831 15:29:48.463168    2876 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:48.463181    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:29:48.463194    2876 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 15:29:48.463223    2876 addons.go:69] Setting storage-provisioner=true in profile "ha-949000"
	I0831 15:29:48.463228    2876 addons.go:69] Setting default-storageclass=true in profile "ha-949000"
	I0831 15:29:48.463245    2876 addons.go:234] Setting addon storage-provisioner=true in "ha-949000"
	I0831 15:29:48.463250    2876 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-949000"
	I0831 15:29:48.463260    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:29:48.463303    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:48.463512    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.463518    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.463528    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.463540    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.472681    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51052
	I0831 15:29:48.473013    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51054
	I0831 15:29:48.473095    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.473332    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.473451    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.473463    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.473652    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.473665    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.473689    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.473921    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.474101    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.474113    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.474145    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.474214    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.474299    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.476440    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:48.476667    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, U
serAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 15:29:48.477025    2876 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 15:29:48.477197    2876 addons.go:234] Setting addon default-storageclass=true in "ha-949000"
	I0831 15:29:48.477218    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:29:48.477428    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.477442    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.483175    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51056
	I0831 15:29:48.483519    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.483886    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.483904    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.484146    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.484254    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.484334    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.484406    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.485343    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:48.485904    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51058
	I0831 15:29:48.486187    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.486486    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.486495    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.486696    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.487040    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.487078    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.495680    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51060
	I0831 15:29:48.496017    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.496360    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.496389    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.496611    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.496715    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.496791    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.496872    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.497794    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:48.497926    2876 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0831 15:29:48.497934    2876 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0831 15:29:48.497944    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:48.498021    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:48.498099    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:48.498200    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:48.498277    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:48.507200    2876 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0831 15:29:48.527696    2876 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0831 15:29:48.527708    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0831 15:29:48.527725    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:48.527878    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:48.527981    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:48.528082    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:48.528217    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:48.528370    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.169.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0831 15:29:48.564053    2876 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0831 15:29:48.586435    2876 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0831 15:29:48.827708    2876 start.go:971] {"host.minikube.internal": 192.169.0.1} host record injected into CoreDNS's ConfigMap
	I0831 15:29:48.827730    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.827739    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.827907    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.827916    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.827922    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.827926    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.828046    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.828049    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:48.828058    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.828113    2876 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 15:29:48.828125    2876 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 15:29:48.828210    2876 round_trippers.go:463] GET https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0831 15:29:48.828215    2876 round_trippers.go:469] Request Headers:
	I0831 15:29:48.828223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:29:48.828227    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:29:48.833724    2876 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:29:48.834156    2876 round_trippers.go:463] PUT https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0831 15:29:48.834163    2876 round_trippers.go:469] Request Headers:
	I0831 15:29:48.834169    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:29:48.834199    2876 round_trippers.go:473]     Content-Type: application/json
	I0831 15:29:48.834205    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:29:48.835718    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:29:48.835861    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.835876    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.836028    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.836037    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.836048    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.019783    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:49.019796    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:49.019979    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:49.019989    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:49.019994    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:49.019999    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:49.019999    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.020151    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:49.020153    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.020159    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:49.059498    2876 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
	I0831 15:29:49.117324    2876 addons.go:510] duration metric: took 654.121351ms for enable addons: enabled=[default-storageclass storage-provisioner]
	I0831 15:29:49.117374    2876 start.go:246] waiting for cluster config update ...
	I0831 15:29:49.117390    2876 start.go:255] writing updated cluster config ...
	I0831 15:29:49.155430    2876 out.go:201] 
	I0831 15:29:49.192527    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:49.192625    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:49.214378    2876 out.go:177] * Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	I0831 15:29:49.272137    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:49.272171    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:29:49.272338    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:29:49.272356    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:29:49.272445    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:49.273113    2876 start.go:360] acquireMachinesLock for ha-949000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:29:49.273204    2876 start.go:364] duration metric: took 68.322µs to acquireMachinesLock for "ha-949000-m02"
	I0831 15:29:49.273234    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:49.273329    2876 start.go:125] createHost starting for "m02" (driver="hyperkit")
	I0831 15:29:49.296266    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:29:49.296429    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:49.296488    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:49.306391    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51065
	I0831 15:29:49.306732    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:49.307039    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:49.307051    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:49.307254    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:49.307374    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:29:49.307457    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:29:49.307559    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:29:49.307576    2876 client.go:168] LocalClient.Create starting
	I0831 15:29:49.307604    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:29:49.307643    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:49.307655    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:49.307696    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:29:49.307726    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:49.307735    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:49.307749    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:29:49.307754    2876 main.go:141] libmachine: (ha-949000-m02) Calling .PreCreateCheck
	I0831 15:29:49.307836    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.307906    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:29:49.333695    2876 main.go:141] libmachine: Creating machine...
	I0831 15:29:49.333716    2876 main.go:141] libmachine: (ha-949000-m02) Calling .Create
	I0831 15:29:49.333916    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.334092    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.333909    2898 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:49.334195    2876 main.go:141] libmachine: (ha-949000-m02) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:29:49.534537    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.534440    2898 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa...
	I0831 15:29:49.629999    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.629917    2898 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk...
	I0831 15:29:49.630021    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Writing magic tar header
	I0831 15:29:49.630031    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Writing SSH key tar header
	I0831 15:29:49.630578    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.630526    2898 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02 ...
	I0831 15:29:49.986563    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.986593    2876 main.go:141] libmachine: (ha-949000-m02) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid
	I0831 15:29:49.986663    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Using UUID 23e5d675-5201-4f3d-86b7-b25c818528d1
	I0831 15:29:50.021467    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Generated MAC 92:7:3c:3f:ee:b7
	I0831 15:29:50.021484    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:29:50.021548    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:50.021582    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:50.021623    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "23e5d675-5201-4f3d-86b7-b25c818528d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:29:50.021665    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 23e5d675-5201-4f3d-86b7-b25c818528d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:29:50.021684    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:29:50.024624    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Pid is 2899
	I0831 15:29:50.025044    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 0
	I0831 15:29:50.025058    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:50.025119    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:50.026207    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:50.026276    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:50.026305    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:50.026350    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:50.026373    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:50.026416    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:50.032754    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:29:50.041001    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:29:50.041896    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:50.041918    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:50.041929    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:50.041946    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:50.432260    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:29:50.432276    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:29:50.547071    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:50.547090    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:50.547112    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:50.547127    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:50.547965    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:29:50.547973    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:29:52.027270    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 1
	I0831 15:29:52.027288    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:52.027415    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:52.028177    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:52.028225    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:52.028236    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:52.028247    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:52.028254    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:52.028263    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:54.029110    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 2
	I0831 15:29:54.029126    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:54.029231    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:54.029999    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:54.030057    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:54.030075    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:54.030087    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:54.030095    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:54.030103    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:56.031274    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 3
	I0831 15:29:56.031292    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:56.031369    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:56.032155    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:56.032168    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:56.032178    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:56.032196    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:56.032213    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:56.032224    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:56.132338    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:29:56.132386    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:29:56.132396    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:29:56.155372    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:29:58.032308    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 4
	I0831 15:29:58.032325    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:58.032424    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:58.033214    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:58.033247    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:58.033259    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:58.033269    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:58.033278    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:58.033287    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:30:00.033449    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 5
	I0831 15:30:00.033465    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:00.033544    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:30:00.034313    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:30:00.034404    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:30:00.034418    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:30:00.034426    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found match: 92:7:3c:3f:ee:b7
	I0831 15:30:00.034433    2876 main.go:141] libmachine: (ha-949000-m02) DBG | IP: 192.169.0.6
	I0831 15:30:00.034475    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:30:00.035147    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:00.035249    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:00.035348    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:30:00.035357    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:30:00.035434    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:00.035493    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:30:00.036274    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:30:00.036284    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:30:00.036289    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:30:00.036293    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:00.036398    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:00.036485    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:00.036575    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:00.036655    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:00.036771    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:00.036969    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:00.036976    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:30:01.059248    2876 main.go:141] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I0831 15:30:04.124333    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:30:04.124345    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:30:04.124351    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.124488    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.124590    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.124683    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.124778    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.124921    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.125101    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.125110    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:30:04.190272    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:30:04.190323    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:30:04.190329    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:30:04.190334    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.190465    2876 buildroot.go:166] provisioning hostname "ha-949000-m02"
	I0831 15:30:04.190476    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.190558    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.190652    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.190763    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.190844    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.190943    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.191068    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.191204    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.191213    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m02 && echo "ha-949000-m02" | sudo tee /etc/hostname
	I0831 15:30:04.267934    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m02
	
	I0831 15:30:04.267948    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.268081    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.268202    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.268299    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.268391    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.268525    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.268665    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.268684    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:30:04.340314    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:30:04.340330    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:30:04.340340    2876 buildroot.go:174] setting up certificates
	I0831 15:30:04.340346    2876 provision.go:84] configureAuth start
	I0831 15:30:04.340353    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.340483    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:04.340577    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.340665    2876 provision.go:143] copyHostCerts
	I0831 15:30:04.340691    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:30:04.340751    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:30:04.340757    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:30:04.340904    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:30:04.341121    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:30:04.341161    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:30:04.341166    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:30:04.341243    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:30:04.341390    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:30:04.341427    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:30:04.341432    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:30:04.341508    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:30:04.341670    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m02 san=[127.0.0.1 192.169.0.6 ha-949000-m02 localhost minikube]
	I0831 15:30:04.509456    2876 provision.go:177] copyRemoteCerts
	I0831 15:30:04.509508    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:30:04.509523    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.509674    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.509762    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.509874    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.509973    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:04.550810    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:30:04.550883    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:30:04.571982    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:30:04.572058    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:30:04.592601    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:30:04.592680    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:30:04.612516    2876 provision.go:87] duration metric: took 272.157929ms to configureAuth
	I0831 15:30:04.612531    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:30:04.612691    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:04.612706    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:04.612851    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.612970    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.613064    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.613150    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.613227    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.613345    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.613483    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.613491    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:30:04.678333    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:30:04.678345    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:30:04.678436    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:30:04.678450    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.678582    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.678669    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.678767    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.678846    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.678978    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.679124    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.679167    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:30:04.756204    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:30:04.756224    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.756411    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.756527    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.756630    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.756734    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.756851    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.757006    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.757027    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:30:06.370825    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:30:06.370840    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:30:06.370855    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetURL
	I0831 15:30:06.370996    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:30:06.371003    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:30:06.371008    2876 client.go:171] duration metric: took 17.063185858s to LocalClient.Create
	I0831 15:30:06.371017    2876 start.go:167] duration metric: took 17.063218984s to libmachine.API.Create "ha-949000"
	I0831 15:30:06.371023    2876 start.go:293] postStartSetup for "ha-949000-m02" (driver="hyperkit")
	I0831 15:30:06.371029    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:30:06.371039    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.371176    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:30:06.371190    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.371279    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.371365    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.371448    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.371522    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:06.410272    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:30:06.413456    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:30:06.413467    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:30:06.413573    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:30:06.413753    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:30:06.413762    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:30:06.413962    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:30:06.421045    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:30:06.440540    2876 start.go:296] duration metric: took 69.508758ms for postStartSetup
	I0831 15:30:06.440562    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:30:06.441179    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:06.441343    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:30:06.441726    2876 start.go:128] duration metric: took 17.168146238s to createHost
	I0831 15:30:06.441741    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.441826    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.441909    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.442008    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.442102    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.442220    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:06.442339    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:06.442346    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:30:06.507669    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143406.563138986
	
	I0831 15:30:06.507682    2876 fix.go:216] guest clock: 1725143406.563138986
	I0831 15:30:06.507687    2876 fix.go:229] Guest: 2024-08-31 15:30:06.563138986 -0700 PDT Remote: 2024-08-31 15:30:06.441735 -0700 PDT m=+57.202103081 (delta=121.403986ms)
	I0831 15:30:06.507698    2876 fix.go:200] guest clock delta is within tolerance: 121.403986ms
	I0831 15:30:06.507701    2876 start.go:83] releasing machines lock for "ha-949000-m02", held for 17.234244881s
	I0831 15:30:06.507719    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.507845    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:06.534518    2876 out.go:177] * Found network options:
	I0831 15:30:06.585154    2876 out.go:177]   - NO_PROXY=192.169.0.5
	W0831 15:30:06.608372    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:30:06.608434    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609377    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609624    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609725    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:30:06.609763    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	W0831 15:30:06.609837    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:30:06.609978    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.609993    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:30:06.610018    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.610265    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.610300    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.610460    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.610487    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.610621    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:06.610643    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.610806    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	W0831 15:30:06.649012    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:30:06.649075    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:30:06.693849    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:30:06.693863    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:30:06.693938    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:30:06.709316    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:30:06.718380    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:30:06.727543    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:30:06.727609    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:30:06.736698    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:30:06.745615    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:30:06.755140    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:30:06.764398    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:30:06.773464    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:30:06.782661    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:30:06.791918    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:30:06.801132    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:30:06.809259    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:30:06.817528    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:06.918051    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:30:06.937658    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:30:06.937726    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:30:06.952225    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:30:06.964364    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:30:06.981641    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:30:06.992676    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:30:07.003746    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:30:07.061399    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:30:07.071765    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:30:07.086915    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:30:07.089960    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:30:07.097339    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:30:07.110902    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:30:07.218878    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:30:07.327438    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:30:07.327478    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:30:07.343077    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:07.455166    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:30:09.753051    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.297833346s)
	I0831 15:30:09.753112    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:30:09.763410    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:30:09.776197    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:30:09.788015    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:30:09.886287    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:30:09.979666    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:10.091986    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:30:10.105474    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:30:10.116526    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:10.223654    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:30:10.284365    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:30:10.284447    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:30:10.288841    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:30:10.288894    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:30:10.292674    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:30:10.327492    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:30:10.327571    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:30:10.348428    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:30:10.394804    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:30:10.438643    2876 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:30:10.460438    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:10.460677    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:30:10.463911    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:30:10.474227    2876 mustload.go:65] Loading cluster: ha-949000
	I0831 15:30:10.474382    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:10.474620    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:10.474636    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:10.483465    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51091
	I0831 15:30:10.483852    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:10.484170    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:10.484182    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:10.484380    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:10.484504    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:30:10.484591    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:10.484661    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:30:10.485631    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:30:10.485888    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:10.485912    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:10.494468    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51093
	I0831 15:30:10.494924    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:10.495238    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:10.495250    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:10.495476    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:10.495585    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:30:10.495693    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.6
	I0831 15:30:10.495700    2876 certs.go:194] generating shared ca certs ...
	I0831 15:30:10.495711    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.495883    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:30:10.495953    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:30:10.495961    2876 certs.go:256] generating profile certs ...
	I0831 15:30:10.496069    2876 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:30:10.496092    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952
	I0831 15:30:10.496104    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0831 15:30:10.585710    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 ...
	I0831 15:30:10.585732    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952: {Name:mkfd98043f041b827744dcc9a0bc27d9f7ba3a8d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.586080    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952 ...
	I0831 15:30:10.586093    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952: {Name:mk6025bd0561394827636d384e273ec532f21510 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.586307    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:30:10.586527    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:30:10.586791    2876 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:30:10.586800    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:30:10.586823    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:30:10.586842    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:30:10.586860    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:30:10.586879    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:30:10.586902    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:30:10.586921    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:30:10.586939    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:30:10.587027    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:30:10.587073    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:30:10.587082    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:30:10.587115    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:30:10.587145    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:30:10.587174    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:30:10.587237    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:30:10.587271    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:10.587293    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:30:10.587312    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:30:10.587343    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:30:10.587493    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:30:10.587598    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:30:10.587689    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:30:10.587790    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:30:10.619319    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:30:10.622586    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:30:10.631798    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:30:10.634863    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:30:10.644806    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:30:10.648392    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:30:10.657224    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:30:10.660506    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:30:10.668998    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:30:10.672282    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:30:10.681734    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:30:10.685037    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:30:10.697579    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:30:10.717100    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:30:10.736755    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:30:10.757074    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:30:10.776635    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0831 15:30:10.796052    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:30:10.815309    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:30:10.834549    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:30:10.854663    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:30:10.873734    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:30:10.892872    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:30:10.912223    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:30:10.925669    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:30:10.939310    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:30:10.952723    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:30:10.966203    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:30:10.980670    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:30:10.994195    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:30:11.007818    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:30:11.012076    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:30:11.021306    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.024674    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.024710    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.028962    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:30:11.038172    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:30:11.048226    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.051704    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.051746    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.056026    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:30:11.065281    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:30:11.074586    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.077977    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.078018    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.082263    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:30:11.091560    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:30:11.094606    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:30:11.094641    2876 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.31.0 docker true true} ...
	I0831 15:30:11.094696    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:30:11.094712    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:30:11.094743    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:30:11.107306    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:30:11.107348    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:30:11.107400    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:30:11.116476    2876 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:30:11.116538    2876 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:30:11.125199    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet
	I0831 15:30:11.125199    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl
	I0831 15:30:11.125202    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm
	I0831 15:30:13.495982    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:30:13.496079    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:30:13.499639    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:30:13.499660    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:30:14.245316    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:30:14.245403    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:30:14.249019    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:30:14.249045    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:30:14.305452    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:30:14.335903    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:30:14.336035    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:30:14.348689    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:30:14.348746    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
	I0831 15:30:14.608960    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:30:14.617331    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:30:14.630716    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:30:14.643952    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:30:14.657665    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:30:14.660616    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:30:14.670825    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:14.766762    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:30:14.782036    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:30:14.782341    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:14.782363    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:14.791218    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51120
	I0831 15:30:14.791554    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:14.791943    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:14.791962    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:14.792169    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:14.792281    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:30:14.792379    2876 start.go:317] joinCluster: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:30:14.792482    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0831 15:30:14.792500    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:30:14.792589    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:30:14.792677    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:30:14.792804    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:30:14.792889    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:30:14.904364    2876 start.go:343] trying to join control-plane node "m02" to cluster: &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:30:14.904404    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token sa5gl8.nk4lqkhvqrn6uouk --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443"
	I0831 15:30:43.067719    2876 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token sa5gl8.nk4lqkhvqrn6uouk --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443": (28.162893612s)
	I0831 15:30:43.067762    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0831 15:30:43.495593    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000-m02 minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=false
	I0831 15:30:43.584878    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-949000-m02 node-role.kubernetes.io/control-plane:NoSchedule-
	I0831 15:30:43.672222    2876 start.go:319] duration metric: took 28.879433845s to joinCluster
	I0831 15:30:43.672264    2876 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:30:43.672464    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:43.696001    2876 out.go:177] * Verifying Kubernetes components...
	I0831 15:30:43.753664    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:43.969793    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:30:43.995704    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:30:43.995955    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:30:43.995999    2876 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:30:43.996168    2876 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:30:43.996224    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:43.996229    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:43.996246    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:43.996253    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:44.008886    2876 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0831 15:30:44.496443    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:44.496458    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:44.496465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:44.496468    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:44.499732    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:44.996970    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:44.996984    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:44.996990    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:44.996993    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.000189    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:45.496917    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:45.496930    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:45.496936    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:45.496939    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.498866    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:30:45.996558    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:45.996579    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:45.996604    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:45.996626    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.999357    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:45.999667    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:46.496895    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:46.496907    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:46.496914    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:46.496917    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:46.499220    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:46.996382    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:46.996397    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:46.996403    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:46.996406    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:46.998788    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:47.497035    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:47.497048    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:47.497055    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:47.497059    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:47.499487    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:47.996662    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:47.996675    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:47.996695    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:47.996699    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:47.998935    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:48.496588    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:48.496603    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:48.496610    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:48.496613    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:48.498806    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:48.499160    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:48.996774    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:48.996800    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:48.996806    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:48.996810    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:48.998862    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:49.496728    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:49.496741    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:49.496748    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:49.496753    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:49.500270    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:49.996536    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:49.996548    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:49.996555    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:49.996560    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:49.998977    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:50.496423    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:50.496441    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:50.496452    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:50.496458    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:50.499488    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:50.499941    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:50.996502    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:50.996515    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:50.996520    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:50.996525    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:50.998339    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:30:51.496978    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:51.496999    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:51.497011    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:51.497018    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:51.499859    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:51.997186    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:51.997200    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:51.997207    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:51.997210    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.000228    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:52.498065    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:52.498084    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:52.498093    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:52.498097    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.500425    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:52.500868    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:52.996733    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:52.996786    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:52.996804    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:52.996819    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.999878    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:53.496732    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:53.496752    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:53.496764    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:53.496772    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:53.499723    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:53.996635    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:53.996698    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:53.996722    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:53.996730    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.000327    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:54.496855    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:54.496875    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:54.496883    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:54.496888    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.499247    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:54.996676    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:54.996692    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:54.996701    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.996706    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:54.999066    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:54.999477    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:55.496949    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:55.496960    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:55.496967    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:55.496971    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:55.499074    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:55.996611    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:55.996627    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:55.996644    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:55.996651    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:55.999061    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:56.497363    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:56.497376    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:56.497383    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:56.497386    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:56.499540    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:56.997791    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:56.997810    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:56.997822    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:56.997828    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.001116    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:57.001481    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:57.497843    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:57.497862    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:57.497874    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:57.497881    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.500770    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:57.998298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:57.998324    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:57.998335    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.998344    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.002037    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:58.496643    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:58.496664    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:58.496677    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:58.496683    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.499466    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:58.997398    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:58.997468    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:58.997484    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.997490    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.000768    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:59.498644    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:59.498668    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:59.498680    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:59.498685    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.502573    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:59.503046    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:59.996689    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:59.996715    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:59.996765    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:59.996773    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.999409    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:00.496654    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:00.496668    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.496677    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.496681    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.498585    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.499019    2876 node_ready.go:49] node "ha-949000-m02" has status "Ready":"True"
	I0831 15:31:00.499031    2876 node_ready.go:38] duration metric: took 16.50261118s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:31:00.499038    2876 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:31:00.499081    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:00.499087    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.499092    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.499095    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.502205    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.506845    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.506892    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:31:00.506897    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.506903    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.506908    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.508659    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.509078    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.509085    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.509091    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.509094    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.510447    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.510831    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.510839    2876 pod_ready.go:82] duration metric: took 3.983743ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.510852    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.510887    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:31:00.510892    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.510897    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.510901    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.512274    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.512740    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.512747    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.512752    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.512757    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.514085    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.514446    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.514457    2876 pod_ready.go:82] duration metric: took 3.596287ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.514464    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.514501    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:31:00.514506    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.514512    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.514515    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.517897    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.518307    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.518314    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.518320    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.518324    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.519756    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.520128    2876 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.520138    2876 pod_ready.go:82] duration metric: took 5.668748ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.520144    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.520177    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:31:00.520182    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.520187    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.520191    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.521454    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.521852    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:00.521860    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.521865    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.521870    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.523054    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.523372    2876 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.523381    2876 pod_ready.go:82] duration metric: took 3.231682ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.523393    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.698293    2876 request.go:632] Waited for 174.813181ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:31:00.698344    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:31:00.698420    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.698432    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.698439    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.701539    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.897673    2876 request.go:632] Waited for 195.424003ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.897783    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.897794    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.897805    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.897814    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.900981    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.901407    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.901419    2876 pod_ready.go:82] duration metric: took 378.015429ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.901429    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.097805    2876 request.go:632] Waited for 196.320526ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:31:01.097926    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:31:01.097936    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.097947    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.097955    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.100563    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.298122    2876 request.go:632] Waited for 197.162644ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:01.298157    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:01.298162    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.298168    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.298172    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.300402    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.300781    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:01.300791    2876 pod_ready.go:82] duration metric: took 399.34942ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.300807    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.497316    2876 request.go:632] Waited for 196.39746ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:31:01.497376    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:31:01.497387    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.497397    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.497405    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.500651    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:01.698231    2876 request.go:632] Waited for 196.759957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:01.698322    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:01.698333    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.698344    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.698353    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.701256    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.701766    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:01.701775    2876 pod_ready.go:82] duration metric: took 400.954779ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.701785    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.898783    2876 request.go:632] Waited for 196.946643ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:31:01.898903    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:31:01.898917    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.898929    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.898938    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.902347    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.097749    2876 request.go:632] Waited for 194.738931ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.097815    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.097824    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.097834    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.097843    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.101525    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.102016    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.102028    2876 pod_ready.go:82] duration metric: took 400.230387ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.102037    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.296929    2876 request.go:632] Waited for 194.771963ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:31:02.296979    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:31:02.296996    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.297010    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.297016    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.300518    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.498356    2876 request.go:632] Waited for 197.140595ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.498409    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.498414    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.498421    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.498425    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.500151    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:02.500554    2876 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.500564    2876 pod_ready.go:82] duration metric: took 398.515508ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.500577    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.697756    2876 request.go:632] Waited for 197.121926ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:31:02.697847    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:31:02.697859    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.697871    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.697879    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.701227    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.896975    2876 request.go:632] Waited for 195.16614ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:02.897029    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:02.897044    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.897050    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.897054    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.899135    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:02.899494    2876 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.899504    2876 pod_ready.go:82] duration metric: took 398.915896ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.899511    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.098441    2876 request.go:632] Waited for 198.871316ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:31:03.098576    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:31:03.098587    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.098599    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.098606    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.101995    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.297740    2876 request.go:632] Waited for 194.927579ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:03.297801    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:03.297842    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.297855    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.297863    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.300956    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.301560    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:03.301572    2876 pod_ready.go:82] duration metric: took 402.049602ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.301580    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.498380    2876 request.go:632] Waited for 196.707011ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:31:03.498472    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:31:03.498482    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.498494    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.498505    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.502174    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.696864    2876 request.go:632] Waited for 194.200989ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:03.696916    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:03.696926    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.696938    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.696944    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.700327    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.700769    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:03.700782    2876 pod_ready.go:82] duration metric: took 399.189338ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.700791    2876 pod_ready.go:39] duration metric: took 3.201699285s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:31:03.700816    2876 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:31:03.700877    2876 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:31:03.712528    2876 api_server.go:72] duration metric: took 20.039964419s to wait for apiserver process to appear ...
	I0831 15:31:03.712539    2876 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:31:03.712554    2876 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:31:03.715722    2876 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:31:03.715760    2876 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:31:03.715765    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.715771    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.715775    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.716371    2876 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:31:03.716424    2876 api_server.go:141] control plane version: v1.31.0
	I0831 15:31:03.716433    2876 api_server.go:131] duration metric: took 3.890107ms to wait for apiserver health ...
	I0831 15:31:03.716440    2876 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:31:03.898331    2876 request.go:632] Waited for 181.827666ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:03.898385    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:03.898446    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.898465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.898473    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.903436    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:31:03.906746    2876 system_pods.go:59] 17 kube-system pods found
	I0831 15:31:03.906767    2876 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:31:03.906771    2876 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:31:03.906775    2876 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:31:03.906778    2876 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:31:03.906783    2876 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:31:03.906786    2876 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:31:03.906789    2876 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:31:03.906793    2876 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:31:03.906796    2876 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:31:03.906799    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:31:03.906802    2876 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:31:03.906805    2876 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:31:03.906810    2876 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:31:03.906814    2876 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:31:03.906816    2876 system_pods.go:61] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:31:03.906819    2876 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:31:03.906824    2876 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:31:03.906830    2876 system_pods.go:74] duration metric: took 190.381994ms to wait for pod list to return data ...
	I0831 15:31:03.906835    2876 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:31:04.096833    2876 request.go:632] Waited for 189.933385ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:31:04.096919    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:31:04.096929    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.096940    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.096947    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.100750    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:04.100942    2876 default_sa.go:45] found service account: "default"
	I0831 15:31:04.100955    2876 default_sa.go:55] duration metric: took 194.103228ms for default service account to be created ...
	I0831 15:31:04.100963    2876 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:31:04.297283    2876 request.go:632] Waited for 196.269925ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:04.297349    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:04.297359    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.297370    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.297380    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.301594    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:31:04.305403    2876 system_pods.go:86] 17 kube-system pods found
	I0831 15:31:04.305414    2876 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:31:04.305418    2876 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:31:04.305421    2876 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:31:04.305424    2876 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:31:04.305427    2876 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:31:04.305431    2876 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:31:04.305434    2876 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:31:04.305438    2876 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:31:04.305440    2876 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:31:04.305443    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:31:04.305446    2876 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:31:04.305449    2876 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:31:04.305452    2876 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:31:04.305455    2876 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:31:04.305457    2876 system_pods.go:89] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:31:04.305459    2876 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:31:04.305462    2876 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:31:04.305467    2876 system_pods.go:126] duration metric: took 204.496865ms to wait for k8s-apps to be running ...
	I0831 15:31:04.305472    2876 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:31:04.305532    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:31:04.316332    2876 system_svc.go:56] duration metric: took 10.855844ms WaitForService to wait for kubelet
	I0831 15:31:04.316347    2876 kubeadm.go:582] duration metric: took 20.643776408s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:31:04.316359    2876 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:31:04.497360    2876 request.go:632] Waited for 180.939277ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:31:04.497396    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:31:04.497400    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.497406    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.497409    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.500112    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:04.500615    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:31:04.500630    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:31:04.500640    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:31:04.500644    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:31:04.500647    2876 node_conditions.go:105] duration metric: took 184.28246ms to run NodePressure ...
	I0831 15:31:04.500655    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:31:04.500673    2876 start.go:255] writing updated cluster config ...
	I0831 15:31:04.522012    2876 out.go:201] 
	I0831 15:31:04.543188    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:04.543261    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:04.565062    2876 out.go:177] * Starting "ha-949000-m03" control-plane node in "ha-949000" cluster
	I0831 15:31:04.608029    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:31:04.608097    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:31:04.608326    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:31:04.608349    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:31:04.608480    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:04.609474    2876 start.go:360] acquireMachinesLock for ha-949000-m03: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:31:04.609608    2876 start.go:364] duration metric: took 107.158µs to acquireMachinesLock for "ha-949000-m03"
	I0831 15:31:04.609644    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m03 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:04.609770    2876 start.go:125] createHost starting for "m03" (driver="hyperkit")
	I0831 15:31:04.631012    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:31:04.631142    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:04.631178    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:04.640831    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51128
	I0831 15:31:04.641212    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:04.641538    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:04.641551    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:04.641754    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:04.641864    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:04.641951    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:04.642054    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:31:04.642071    2876 client.go:168] LocalClient.Create starting
	I0831 15:31:04.642111    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:31:04.642169    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:31:04.642179    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:31:04.642217    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:31:04.642255    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:31:04.642264    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:31:04.642276    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:31:04.642281    2876 main.go:141] libmachine: (ha-949000-m03) Calling .PreCreateCheck
	I0831 15:31:04.642379    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:04.642422    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:04.652222    2876 main.go:141] libmachine: Creating machine...
	I0831 15:31:04.652235    2876 main.go:141] libmachine: (ha-949000-m03) Calling .Create
	I0831 15:31:04.652380    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:04.652531    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:04.652372    3223 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:31:04.652595    2876 main.go:141] libmachine: (ha-949000-m03) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:31:04.967913    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:04.967796    3223 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa...
	I0831 15:31:05.218214    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:05.218148    3223 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk...
	I0831 15:31:05.218234    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Writing magic tar header
	I0831 15:31:05.218243    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Writing SSH key tar header
	I0831 15:31:05.219245    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:05.219093    3223 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03 ...
	I0831 15:31:05.777334    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:05.777394    2876 main.go:141] libmachine: (ha-949000-m03) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid
	I0831 15:31:05.777478    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Using UUID 3fdefe95-7552-4d5b-8412-6ae6e5c787bb
	I0831 15:31:05.805053    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Generated MAC fa:59:9e:3b:35:6d
	I0831 15:31:05.805071    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:31:05.805106    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00011a5d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:31:05.805131    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00011a5d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:31:05.805226    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "3fdefe95-7552-4d5b-8412-6ae6e5c787bb", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:31:05.805279    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 3fdefe95-7552-4d5b-8412-6ae6e5c787bb -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:31:05.805308    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:31:05.808244    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Pid is 3227
	I0831 15:31:05.808817    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 0
	I0831 15:31:05.808830    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:05.808902    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:05.809826    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:05.809929    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:05.809949    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:05.809975    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:05.809992    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:05.810004    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:05.810013    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:05.816053    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:31:05.824689    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:31:05.825475    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:31:05.825495    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:31:05.825508    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:31:05.825518    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:31:06.214670    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:31:06.214691    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:31:06.330054    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:31:06.330074    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:31:06.330102    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:31:06.330119    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:31:06.330929    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:31:06.330943    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:31:07.810124    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 1
	I0831 15:31:07.810138    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:07.810246    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:07.811007    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:07.811057    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:07.811067    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:07.811076    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:07.811082    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:07.811088    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:07.811097    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:09.811187    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 2
	I0831 15:31:09.811200    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:09.811312    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:09.812186    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:09.812196    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:09.812205    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:09.812213    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:09.812234    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:09.812241    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:09.812249    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:11.813365    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 3
	I0831 15:31:11.813388    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:11.813446    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:11.814261    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:11.814310    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:11.814328    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:11.814337    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:11.814342    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:11.814361    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:11.814371    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:11.957428    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:31:11.957483    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:31:11.957496    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:31:11.981309    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:31:13.815231    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 4
	I0831 15:31:13.815245    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:13.815334    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:13.816118    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:13.816176    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:13.816186    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:13.816194    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:13.816200    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:13.816208    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:13.816220    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:15.816252    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 5
	I0831 15:31:15.816273    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:15.816393    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:15.817241    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:15.817305    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0831 15:31:15.817315    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:31:15.817332    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found match: fa:59:9e:3b:35:6d
	I0831 15:31:15.817339    2876 main.go:141] libmachine: (ha-949000-m03) DBG | IP: 192.169.0.7
	I0831 15:31:15.817379    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:15.817997    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:15.818096    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:15.818188    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:31:15.818195    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:31:15.818279    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:15.818331    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:15.819115    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:31:15.819122    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:31:15.819126    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:31:15.819130    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:15.819211    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:15.819288    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:15.819367    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:15.819433    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:15.819544    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:15.819737    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:15.819744    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:31:16.864414    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:31:16.864428    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:31:16.864434    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.864597    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.864686    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.864782    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.864877    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.865009    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.865163    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.865170    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:31:16.911810    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:31:16.911850    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:31:16.911857    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:31:16.911862    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:16.911989    2876 buildroot.go:166] provisioning hostname "ha-949000-m03"
	I0831 15:31:16.911998    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:16.912088    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.912161    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.912247    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.912323    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.912399    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.912532    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.912676    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.912685    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m03 && echo "ha-949000-m03" | sudo tee /etc/hostname
	I0831 15:31:16.972401    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m03
	
	I0831 15:31:16.972418    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.972554    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.972683    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.972793    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.972889    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.973016    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.973150    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.973161    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:31:17.026608    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:31:17.026626    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:31:17.026635    2876 buildroot.go:174] setting up certificates
	I0831 15:31:17.026641    2876 provision.go:84] configureAuth start
	I0831 15:31:17.026647    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:17.026793    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:17.026903    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.026995    2876 provision.go:143] copyHostCerts
	I0831 15:31:17.027029    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:31:17.027088    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:31:17.027094    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:31:17.027236    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:31:17.027433    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:31:17.027471    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:31:17.027477    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:31:17.027559    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:31:17.027700    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:31:17.027737    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:31:17.027742    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:31:17.027813    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:31:17.027956    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m03 san=[127.0.0.1 192.169.0.7 ha-949000-m03 localhost minikube]
	I0831 15:31:17.258292    2876 provision.go:177] copyRemoteCerts
	I0831 15:31:17.258340    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:31:17.258353    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.258490    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.258583    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.258663    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.258746    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:17.289869    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:31:17.289967    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:31:17.308984    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:31:17.309048    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:31:17.328947    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:31:17.329010    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:31:17.348578    2876 provision.go:87] duration metric: took 321.944434ms to configureAuth
	I0831 15:31:17.348592    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:31:17.348776    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:17.348791    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:17.348926    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.349023    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.349112    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.349190    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.349267    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.349365    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.349505    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.349513    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:31:17.396974    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:31:17.396988    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:31:17.397075    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:31:17.397087    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.397218    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.397314    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.397402    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.397507    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.397637    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.397789    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.397838    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:31:17.455821    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:31:17.455842    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.455977    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.456072    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.456168    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.456252    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.456374    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.456520    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.456533    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:31:19.032300    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:31:19.032316    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:31:19.032323    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetURL
	I0831 15:31:19.032456    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:31:19.032464    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:31:19.032468    2876 client.go:171] duration metric: took 14.391172658s to LocalClient.Create
	I0831 15:31:19.032480    2876 start.go:167] duration metric: took 14.391215349s to libmachine.API.Create "ha-949000"
	I0831 15:31:19.032489    2876 start.go:293] postStartSetup for "ha-949000-m03" (driver="hyperkit")
	I0831 15:31:19.032496    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:31:19.032506    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.032660    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:31:19.032675    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.032767    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.032855    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.032947    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.033033    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:19.073938    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:31:19.079886    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:31:19.079901    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:31:19.080017    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:31:19.080199    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:31:19.080206    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:31:19.080413    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:31:19.092434    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:31:19.119963    2876 start.go:296] duration metric: took 87.46929ms for postStartSetup
	I0831 15:31:19.119990    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:19.120591    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:19.120767    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:19.121161    2876 start.go:128] duration metric: took 14.512164484s to createHost
	I0831 15:31:19.121177    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.121269    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.121343    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.121419    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.121507    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.121631    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:19.121747    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:19.121754    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:31:19.168319    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143479.023948613
	
	I0831 15:31:19.168331    2876 fix.go:216] guest clock: 1725143479.023948613
	I0831 15:31:19.168337    2876 fix.go:229] Guest: 2024-08-31 15:31:19.023948613 -0700 PDT Remote: 2024-08-31 15:31:19.12117 -0700 PDT m=+129.881500927 (delta=-97.221387ms)
	I0831 15:31:19.168349    2876 fix.go:200] guest clock delta is within tolerance: -97.221387ms
	I0831 15:31:19.168354    2876 start.go:83] releasing machines lock for "ha-949000-m03", held for 14.559521208s
	I0831 15:31:19.168370    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.168508    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:19.193570    2876 out.go:177] * Found network options:
	I0831 15:31:19.255565    2876 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0831 15:31:19.295062    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:31:19.295088    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:31:19.295104    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.295822    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.296008    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.296101    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:31:19.296130    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	W0831 15:31:19.296153    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:31:19.296165    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:31:19.296225    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:31:19.296229    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.296236    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.296334    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.296350    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.296442    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.296455    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.296560    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:19.296581    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.296680    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	W0831 15:31:19.323572    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:31:19.323629    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:31:19.371272    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:31:19.371294    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:31:19.371393    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:31:19.387591    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:31:19.396789    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:31:19.405160    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:31:19.405208    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:31:19.413496    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:31:19.422096    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:31:19.430386    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:31:19.438699    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:31:19.447187    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:31:19.455984    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:31:19.464947    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:31:19.474438    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:31:19.482528    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:31:19.490487    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:19.582349    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:31:19.599985    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:31:19.600056    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:31:19.612555    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:31:19.632269    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:31:19.650343    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:31:19.661102    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:31:19.671812    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:31:19.695791    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:31:19.706786    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:31:19.722246    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:31:19.725125    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:31:19.732176    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:31:19.745845    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:31:19.848832    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:31:19.960260    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:31:19.960281    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:31:19.974005    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:20.073538    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:31:22.469978    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.396488217s)
	I0831 15:31:22.470044    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:31:22.482132    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:31:22.494892    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:31:22.505113    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:31:22.597737    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:31:22.715451    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:22.823995    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:31:22.837904    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:31:22.849106    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:22.943937    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:31:23.002374    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:31:23.002452    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:31:23.006859    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:31:23.006916    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:31:23.010129    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:31:23.037227    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:31:23.037307    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:31:23.056021    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:31:23.095679    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:31:23.119303    2876 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:31:23.162269    2876 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:31:23.183203    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:23.183553    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:31:23.187788    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:31:23.197219    2876 mustload.go:65] Loading cluster: ha-949000
	I0831 15:31:23.197405    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:23.197647    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:23.197669    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:23.206705    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51151
	I0831 15:31:23.207061    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:23.207432    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:23.207448    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:23.207666    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:23.207786    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:31:23.207874    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:23.207946    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:31:23.208928    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:31:23.209186    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:23.209220    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:23.218074    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51153
	I0831 15:31:23.218433    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:23.218804    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:23.218819    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:23.219039    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:23.219165    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:31:23.219284    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.7
	I0831 15:31:23.219289    2876 certs.go:194] generating shared ca certs ...
	I0831 15:31:23.219301    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.219493    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:31:23.219569    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:31:23.219578    2876 certs.go:256] generating profile certs ...
	I0831 15:31:23.219685    2876 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:31:23.219705    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3
	I0831 15:31:23.219719    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0831 15:31:23.437317    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 ...
	I0831 15:31:23.437340    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3: {Name:mk58aa028a0f003ebc9e4d90dc317cdac139f88f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.437643    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3 ...
	I0831 15:31:23.437656    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3: {Name:mkaffb8ad3060932ca991ed93b1f8350d31a48ee Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.437859    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:31:23.438064    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:31:23.438321    2876 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:31:23.438330    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:31:23.438352    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:31:23.438370    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:31:23.438423    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:31:23.438445    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:31:23.438467    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:31:23.438484    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:31:23.438502    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:31:23.438598    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:31:23.438648    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:31:23.438657    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:31:23.438698    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:31:23.438737    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:31:23.438775    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:31:23.438861    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:31:23.438902    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.438923    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.438941    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.438970    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:31:23.439126    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:31:23.439259    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:31:23.439370    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:31:23.439494    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:31:23.472129    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:31:23.475604    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:31:23.483468    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:31:23.486771    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:31:23.494732    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:31:23.497856    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:31:23.505900    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:31:23.509221    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:31:23.517853    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:31:23.521110    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:31:23.529522    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:31:23.532921    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:31:23.540561    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:31:23.560999    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:31:23.580941    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:31:23.601890    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:31:23.621742    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1444 bytes)
	I0831 15:31:23.642294    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:31:23.662119    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:31:23.682734    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:31:23.702621    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:31:23.722704    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:31:23.743032    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:31:23.763003    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:31:23.776540    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:31:23.790112    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:31:23.803743    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:31:23.817470    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:31:23.831871    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:31:23.845310    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:31:23.858947    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:31:23.863254    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:31:23.871668    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.875114    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.875147    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.879499    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:31:23.888263    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:31:23.896800    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.900783    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.900840    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.905239    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:31:23.913677    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:31:23.921998    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.925382    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.925421    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.929547    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:31:23.938211    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:31:23.941244    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:31:23.941280    2876 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.31.0 docker true true} ...
	I0831 15:31:23.941346    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:31:23.941365    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:31:23.941403    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:31:23.953552    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:31:23.953594    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:31:23.953640    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:31:23.961797    2876 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:31:23.961850    2876 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:31:23.970244    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256
	I0831 15:31:23.970245    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256
	I0831 15:31:23.970248    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256
	I0831 15:31:23.970260    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:31:23.970262    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:31:23.970297    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:31:23.970351    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:31:23.970358    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:31:23.982898    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:31:23.982926    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:31:23.982950    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:31:23.982949    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:31:23.982968    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:31:23.983039    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:31:24.006648    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:31:24.006684    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
	I0831 15:31:24.520609    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:31:24.528302    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:31:24.542845    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:31:24.556549    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:31:24.581157    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:31:24.584179    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:31:24.593696    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:24.689916    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:31:24.707403    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:31:24.707700    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:24.707728    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:24.717047    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51156
	I0831 15:31:24.717380    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:24.717728    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:24.717743    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:24.718003    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:24.718123    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:31:24.718213    2876 start.go:317] joinCluster: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:31:24.718336    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0831 15:31:24.718349    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:31:24.718430    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:31:24.718495    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:31:24.718573    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:31:24.718638    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:31:24.810129    2876 start.go:343] trying to join control-plane node "m03" to cluster: &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:24.810181    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token l0ka7f.9kdk1py3wyogvy9t --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443"
	I0831 15:31:52.526613    2876 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token l0ka7f.9kdk1py3wyogvy9t --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443": (27.716564604s)
	I0831 15:31:52.526639    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0831 15:31:53.011028    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000-m03 minikube.k8s.io/updated_at=2024_08_31T15_31_53_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=false
	I0831 15:31:53.087862    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-949000-m03 node-role.kubernetes.io/control-plane:NoSchedule-
	I0831 15:31:53.172826    2876 start.go:319] duration metric: took 28.454760565s to joinCluster
	I0831 15:31:53.172884    2876 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:53.173075    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:53.197446    2876 out.go:177] * Verifying Kubernetes components...
	I0831 15:31:53.254031    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:53.535623    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:31:53.558317    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:31:53.558557    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:31:53.558593    2876 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:31:53.558836    2876 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:31:53.558893    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:53.558899    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:53.558906    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:53.558909    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:53.561151    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:54.058994    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:54.059009    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:54.059015    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:54.059020    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:54.061381    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:54.559376    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:54.559389    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:54.559396    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:54.559399    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:54.561772    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:55.059628    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:55.059676    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:55.059690    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:55.059700    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:55.063078    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:55.559418    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:55.559433    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:55.559439    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:55.559442    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:55.561338    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:55.561664    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:31:56.059758    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:56.059770    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:56.059776    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:56.059780    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:56.061794    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:56.560083    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:56.560095    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:56.560101    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:56.560105    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:56.562114    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.058995    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:57.059011    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:57.059017    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:57.059021    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:57.060963    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.560137    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:57.560149    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:57.560155    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:57.560159    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:57.561978    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.562328    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:31:58.059061    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:58.059074    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:58.059080    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:58.059086    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:58.061472    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:58.559244    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:58.559270    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:58.559282    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:58.559289    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:58.562722    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:59.060308    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:59.060330    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:59.060342    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:59.060359    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:59.063517    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:59.560099    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:59.560116    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:59.560125    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:59.560129    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:59.562184    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:59.562628    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:00.059591    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:00.059615    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:00.059662    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:00.059677    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:00.063389    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:00.560430    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:00.560444    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:00.560451    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:00.560455    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:00.562483    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:01.059473    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:01.059498    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:01.059509    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:01.059514    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:01.062773    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:01.559271    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:01.559298    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:01.559310    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:01.559317    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:01.562641    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:01.563242    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:02.060140    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:02.060168    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:02.060211    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:02.060244    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:02.063601    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:02.559282    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:02.559308    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:02.559320    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:02.559329    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:02.562623    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:03.059890    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:03.059911    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:03.059923    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:03.059930    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:03.063409    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:03.559394    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:03.559453    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:03.559465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:03.559470    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:03.562567    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:04.060698    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:04.060714    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:04.060719    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:04.060727    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:04.062955    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:04.063278    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:04.560096    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:04.560118    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:04.560165    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:04.560173    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:04.562791    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:05.060622    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:05.060648    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:05.060659    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:05.060665    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:05.064011    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:05.559954    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:05.559976    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:05.559988    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:05.559994    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:05.563422    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:06.059812    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:06.059870    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:06.059880    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:06.059886    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:06.062529    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:06.560071    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:06.560096    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:06.560107    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:06.560113    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:06.563538    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:06.564037    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:07.059298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:07.059324    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:07.059335    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:07.059342    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:07.063048    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:07.559252    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:07.559277    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:07.559291    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:07.559297    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:07.562373    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:08.061149    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:08.061210    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:08.061223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:08.061234    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:08.064402    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:08.559428    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:08.559452    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:08.559463    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:08.559468    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:08.562526    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:09.060827    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:09.060878    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:09.060891    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:09.060900    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:09.063954    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:09.064537    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:09.561212    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:09.561237    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:09.561283    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:09.561292    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:09.564677    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:10.060675    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:10.060694    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:10.060714    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:10.060718    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:10.062779    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:10.560397    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:10.560424    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:10.560435    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:10.560441    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:10.564079    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.060679    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:11.060705    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:11.060716    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:11.060722    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:11.064114    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.559466    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:11.559492    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:11.559503    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:11.559567    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:11.562752    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.563402    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:12.059348    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:12.059373    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:12.059384    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:12.059389    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:12.062810    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:12.561048    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:12.561106    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:12.561120    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:12.561141    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:12.564459    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.059831    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.059855    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.059867    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.059873    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.063079    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.063582    2876 node_ready.go:49] node "ha-949000-m03" has status "Ready":"True"
	I0831 15:32:13.063594    2876 node_ready.go:38] duration metric: took 19.504599366s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:32:13.063602    2876 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:32:13.063657    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:13.063665    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.063674    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.063682    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.067458    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.072324    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.072373    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:32:13.072379    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.072385    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.072389    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.074327    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.074802    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.074810    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.074815    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.074820    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.076654    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.076987    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.076996    2876 pod_ready.go:82] duration metric: took 4.661444ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.077003    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.077041    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:32:13.077046    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.077052    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.077056    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.078862    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.079264    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.079271    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.079277    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.079280    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.081027    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.081326    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.081335    2876 pod_ready.go:82] duration metric: took 4.326858ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.081342    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.081372    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:32:13.081379    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.081385    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.081388    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.083263    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.083632    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.083639    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.083645    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.083649    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.085181    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.085480    2876 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.085490    2876 pod_ready.go:82] duration metric: took 4.142531ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.085497    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.085526    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:32:13.085531    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.085537    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.085541    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.087128    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.087501    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:13.087508    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.087513    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.087518    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.088959    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.089244    2876 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.089252    2876 pod_ready.go:82] duration metric: took 3.751049ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.089258    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.261887    2876 request.go:632] Waited for 172.592535ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:32:13.261972    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:32:13.261978    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.262019    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.262028    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.264296    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:13.460589    2876 request.go:632] Waited for 195.842812ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.460724    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.460735    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.460745    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.460759    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.463962    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.464378    2876 pod_ready.go:93] pod "etcd-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.464391    2876 pod_ready.go:82] duration metric: took 375.12348ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.464404    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.661862    2876 request.go:632] Waited for 197.406518ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:32:13.661977    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:32:13.661988    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.661999    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.662005    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.665393    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.861181    2876 request.go:632] Waited for 195.385788ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.861214    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.861218    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.861225    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.861260    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.863261    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.863567    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.863577    2876 pod_ready.go:82] duration metric: took 399.161484ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.863584    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.061861    2876 request.go:632] Waited for 198.232413ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:32:14.061952    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:32:14.061961    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.061972    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.061979    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.064530    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:14.260004    2876 request.go:632] Waited for 194.98208ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:14.260143    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:14.260166    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.260182    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.260227    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.266580    2876 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:32:14.266908    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:14.266927    2876 pod_ready.go:82] duration metric: took 403.325368ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.266937    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.460025    2876 request.go:632] Waited for 193.045445ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:32:14.460093    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:32:14.460101    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.460110    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.460117    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.462588    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:14.660940    2876 request.go:632] Waited for 197.721547ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:14.661070    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:14.661080    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.661096    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.661109    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.664541    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:14.664954    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:14.664967    2876 pod_ready.go:82] duration metric: took 398.020825ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.664979    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.861147    2876 request.go:632] Waited for 196.115866ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:32:14.861203    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:32:14.861211    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.861223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.861231    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.864847    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.060912    2876 request.go:632] Waited for 195.310518ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:15.060968    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:15.060983    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.061000    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.061011    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.064271    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.064583    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.064594    2876 pod_ready.go:82] duration metric: took 399.604845ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.064603    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.260515    2876 request.go:632] Waited for 195.841074ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:32:15.260662    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:32:15.260676    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.260688    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.260702    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.264411    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.461372    2876 request.go:632] Waited for 196.432681ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:15.461470    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:15.461484    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.461502    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.461513    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.464382    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:15.464683    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.464691    2876 pod_ready.go:82] duration metric: took 400.078711ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.464700    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.660288    2876 request.go:632] Waited for 195.551444ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:32:15.660318    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:32:15.660323    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.660357    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.660363    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.663247    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:15.860473    2876 request.go:632] Waited for 196.823661ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:15.860532    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:15.860542    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.860556    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.860563    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.863954    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.864333    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.864346    2876 pod_ready.go:82] duration metric: took 399.636293ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.864355    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.060306    2876 request.go:632] Waited for 195.900703ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:32:16.060410    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:32:16.060437    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.060449    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.060455    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.063745    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.260402    2876 request.go:632] Waited for 195.997957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:16.260523    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:16.260539    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.260551    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.260563    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.264052    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.264373    2876 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:16.264385    2876 pod_ready.go:82] duration metric: took 400.01997ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.264394    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.461128    2876 request.go:632] Waited for 196.682855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:32:16.461251    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:32:16.461264    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.461275    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.461282    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.464602    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.660248    2876 request.go:632] Waited for 195.08291ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:16.660298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:16.660310    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.660327    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.660340    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.663471    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.664017    2876 pod_ready.go:93] pod "kube-proxy-d45q5" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:16.664029    2876 pod_ready.go:82] duration metric: took 399.623986ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.664038    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.859948    2876 request.go:632] Waited for 195.845325ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:32:16.860034    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:32:16.860060    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.860083    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.860094    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.863263    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.060250    2876 request.go:632] Waited for 196.410574ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.060307    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.060319    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.060334    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.060345    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.063664    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.064113    2876 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.064125    2876 pod_ready.go:82] duration metric: took 400.076522ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.064134    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.260150    2876 request.go:632] Waited for 195.935266ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:32:17.260232    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:32:17.260246    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.260305    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.260324    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.263756    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.460703    2876 request.go:632] Waited for 196.426241ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.460753    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.460765    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.460776    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.460799    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.463925    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.464439    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.464449    2876 pod_ready.go:82] duration metric: took 400.306164ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.464463    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.660506    2876 request.go:632] Waited for 196.00354ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:32:17.660541    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:32:17.660547    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.660553    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.660568    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.662504    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:17.859973    2876 request.go:632] Waited for 197.106962ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:17.860023    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:17.860031    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.860084    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.860092    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.869330    2876 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0831 15:32:17.869629    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.869638    2876 pod_ready.go:82] duration metric: took 405.16449ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.869646    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:18.060370    2876 request.go:632] Waited for 190.671952ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:32:18.060479    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:32:18.060492    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.060504    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.060511    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.063196    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.260902    2876 request.go:632] Waited for 197.387182ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:18.260947    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:18.260955    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.260976    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.261000    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.263780    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.264154    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:18.264163    2876 pod_ready.go:82] duration metric: took 394.508983ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:18.264171    2876 pod_ready.go:39] duration metric: took 5.200505122s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:32:18.264182    2876 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:32:18.264235    2876 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:32:18.276016    2876 api_server.go:72] duration metric: took 25.102905505s to wait for apiserver process to appear ...
	I0831 15:32:18.276029    2876 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:32:18.276040    2876 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:32:18.280474    2876 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:32:18.280519    2876 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:32:18.280525    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.280531    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.280535    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.281148    2876 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:32:18.281176    2876 api_server.go:141] control plane version: v1.31.0
	I0831 15:32:18.281184    2876 api_server.go:131] duration metric: took 5.150155ms to wait for apiserver health ...
	I0831 15:32:18.281189    2876 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:32:18.460471    2876 request.go:632] Waited for 179.236076ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.460573    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.460585    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.460596    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.460604    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.465317    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:32:18.469906    2876 system_pods.go:59] 24 kube-system pods found
	I0831 15:32:18.469918    2876 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:32:18.469921    2876 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:32:18.469925    2876 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:32:18.469928    2876 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:32:18.469933    2876 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:32:18.469937    2876 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:32:18.469939    2876 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:32:18.469943    2876 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:32:18.469946    2876 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:32:18.469949    2876 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:32:18.469954    2876 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:32:18.469958    2876 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:32:18.469961    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:32:18.469963    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:32:18.469966    2876 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:32:18.469969    2876 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:32:18.469972    2876 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:32:18.469975    2876 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:32:18.469978    2876 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:32:18.469980    2876 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:32:18.469983    2876 system_pods.go:61] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:32:18.469985    2876 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:32:18.469988    2876 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:32:18.469990    2876 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:32:18.469994    2876 system_pods.go:74] duration metric: took 188.799972ms to wait for pod list to return data ...
	I0831 15:32:18.470000    2876 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:32:18.659945    2876 request.go:632] Waited for 189.894855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:32:18.659986    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:32:18.660002    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.660011    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.660017    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.662843    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.662901    2876 default_sa.go:45] found service account: "default"
	I0831 15:32:18.662910    2876 default_sa.go:55] duration metric: took 192.903479ms for default service account to be created ...
	I0831 15:32:18.662915    2876 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:32:18.860267    2876 request.go:632] Waited for 197.296928ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.860299    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.860304    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.860310    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.860316    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.864052    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:18.868873    2876 system_pods.go:86] 24 kube-system pods found
	I0831 15:32:18.868886    2876 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:32:18.868891    2876 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:32:18.868894    2876 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:32:18.868897    2876 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:32:18.868901    2876 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:32:18.868904    2876 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:32:18.868907    2876 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:32:18.868912    2876 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:32:18.868916    2876 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:32:18.868918    2876 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:32:18.868922    2876 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:32:18.868927    2876 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:32:18.868931    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:32:18.868934    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:32:18.868938    2876 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:32:18.868941    2876 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:32:18.868944    2876 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:32:18.868947    2876 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:32:18.868950    2876 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:32:18.868953    2876 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:32:18.868957    2876 system_pods.go:89] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:32:18.868959    2876 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:32:18.868963    2876 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:32:18.868966    2876 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:32:18.868971    2876 system_pods.go:126] duration metric: took 206.049826ms to wait for k8s-apps to be running ...
	I0831 15:32:18.868980    2876 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:32:18.869030    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:32:18.880958    2876 system_svc.go:56] duration metric: took 11.976044ms WaitForService to wait for kubelet
	I0831 15:32:18.880978    2876 kubeadm.go:582] duration metric: took 25.707859659s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:32:18.880990    2876 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:32:19.060320    2876 request.go:632] Waited for 179.26426ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:32:19.060365    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:32:19.060371    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:19.060379    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:19.060385    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:19.063168    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:19.063767    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063776    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063782    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063785    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063789    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063791    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063794    2876 node_conditions.go:105] duration metric: took 182.798166ms to run NodePressure ...
	I0831 15:32:19.063802    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:32:19.063817    2876 start.go:255] writing updated cluster config ...
	I0831 15:32:19.064186    2876 ssh_runner.go:195] Run: rm -f paused
	I0831 15:32:19.107477    2876 start.go:600] kubectl: 1.29.2, cluster: 1.31.0 (minor skew: 2)
	I0831 15:32:19.128559    2876 out.go:201] 
	W0831 15:32:19.149451    2876 out.go:270] ! /usr/local/bin/kubectl is version 1.29.2, which may have incompatibilities with Kubernetes 1.31.0.
	I0831 15:32:19.170407    2876 out.go:177]   - Want kubectl v1.31.0? Try 'minikube kubectl -- get pods -A'
	I0831 15:32:19.212551    2876 out.go:177] * Done! kubectl is now configured to use "ha-949000" cluster and "default" namespace by default
	
	
	==> Docker <==
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/7da75377db13c80b27b99ccc9f52561a4408675361947cf393e0c38286a71997/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.201910840Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202112013Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202132705Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202328611Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/1017bd5eac1d26de2df318c0dc0ac8d5db92d72e8c268401502a145b3ad0d9d8/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/271da20951c9ab4102e979dc2b97b3a9c8d992db5fc7ebac3f954ea9edee9d48/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.346950244Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347136993Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347223771Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347348772Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379063396Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379210402Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379226413Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379336044Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.320619490Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.320945499Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.321018153Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.321131565Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:32:21Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/f68483c946835415bfdf0531bfc6be41dd321162f4c19af555ece0f66ee7cabe/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Aug 31 22:32:22 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:32:22Z" level=info msg="Stop pulling image gcr.io/k8s-minikube/busybox:1.28: Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:1.28"
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716842379Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716906766Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716920530Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.721236974Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	2f925f16b74b0       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   About a minute ago   Running             busybox                   0                   f68483c946835       busybox-7dff88458-5kkbw
	b1db836cd7a3d       cbb01a7bd410d                                                                                         3 minutes ago        Running             coredns                   0                   271da20951c9a       coredns-6f6b679f8f-kjszm
	def4d6bd20bc5       cbb01a7bd410d                                                                                         3 minutes ago        Running             coredns                   0                   1017bd5eac1d2       coredns-6f6b679f8f-snq8s
	22fbb8a8e01ad       6e38f40d628db                                                                                         3 minutes ago        Running             storage-provisioner       0                   7da75377db13c       storage-provisioner
	6d156ce626115       kindest/kindnetd@sha256:e59a687ca28ae274a2fc92f1e2f5f1c739f353178a43a23aafc71adb802ed166              3 minutes ago        Running             kindnet-cni               0                   7d1851c17485c       kindnet-jzj42
	54d5f8041c89d       ad83b2ca7b09e                                                                                         3 minutes ago        Running             kube-proxy                0                   4b0198ac7dc52       kube-proxy-q7ndn
	c99fe831b20c1       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     4 minutes ago        Running             kube-vip                  0                   9ef7e0fa361d5       kube-vip-ha-949000
	c734c23a53082       2e96e5913fc06                                                                                         4 minutes ago        Running             etcd                      0                   7cfaf9f5d4dd4       etcd-ha-949000
	02c10e4f765d1       1766f54c897f0                                                                                         4 minutes ago        Running             kube-scheduler            0                   c084f2a259f6c       kube-scheduler-ha-949000
	6670fd34164cb       045733566833c                                                                                         4 minutes ago        Running             kube-controller-manager   0                   f9573e28f9d4d       kube-controller-manager-ha-949000
	ffec6106be6c8       604f5db92eaa8                                                                                         4 minutes ago        Running             kube-apiserver            0                   25c49852f78dc       kube-apiserver-ha-949000
	
	
	==> coredns [b1db836cd7a3] <==
	[INFO] 10.244.1.2:56414 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000107837s
	[INFO] 10.244.1.2:53184 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000079726s
	[INFO] 10.244.1.2:58757 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000418868s
	[INFO] 10.244.1.2:39299 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000067106s
	[INFO] 10.244.2.2:56948 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000080585s
	[INFO] 10.244.2.2:56973 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000078985s
	[INFO] 10.244.2.2:43081 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000100123s
	[INFO] 10.244.2.2:56390 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000040214s
	[INFO] 10.244.2.2:52519 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000061255s
	[INFO] 10.244.0.4:36226 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000151133s
	[INFO] 10.244.1.2:44017 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089111s
	[INFO] 10.244.1.2:37224 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000069144s
	[INFO] 10.244.1.2:51282 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000118723s
	[INFO] 10.244.2.2:35009 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089507s
	[INFO] 10.244.2.2:60607 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000049176s
	[INFO] 10.244.2.2:36851 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000097758s
	[INFO] 10.244.0.4:59717 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000053986s
	[INFO] 10.244.0.4:58447 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000060419s
	[INFO] 10.244.1.2:60381 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000136898s
	[INFO] 10.244.1.2:32783 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.00010303s
	[INFO] 10.244.1.2:44904 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000042493s
	[INFO] 10.244.1.2:44085 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000132084s
	[INFO] 10.244.2.2:43635 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000080947s
	[INFO] 10.244.2.2:40020 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000081919s
	[INFO] 10.244.2.2:53730 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000058015s
	
	
	==> coredns [def4d6bd20bc] <==
	[INFO] 10.244.0.4:41865 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.008744161s
	[INFO] 10.244.1.2:50080 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000093199s
	[INFO] 10.244.1.2:55576 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.000574417s
	[INFO] 10.244.1.2:36293 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000065455s
	[INFO] 10.244.2.2:41223 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000063892s
	[INFO] 10.244.0.4:54135 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000096141s
	[INFO] 10.244.0.4:39176 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000742646s
	[INFO] 10.244.0.4:58445 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000080113s
	[INFO] 10.244.0.4:56242 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000066269s
	[INFO] 10.244.0.4:60657 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049645s
	[INFO] 10.244.1.2:48306 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000561931s
	[INFO] 10.244.1.2:40767 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000077826s
	[INFO] 10.244.1.2:35669 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000056994s
	[INFO] 10.244.1.2:57720 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000040565s
	[INFO] 10.244.2.2:38794 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000136901s
	[INFO] 10.244.2.2:33576 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000052374s
	[INFO] 10.244.2.2:57053 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000051289s
	[INFO] 10.244.0.4:47623 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000056903s
	[INFO] 10.244.0.4:59818 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00003011s
	[INFO] 10.244.0.4:53586 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000029565s
	[INFO] 10.244.1.2:60045 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000060878s
	[INFO] 10.244.2.2:38400 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000078624s
	[INFO] 10.244.0.4:58765 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000075707s
	[INFO] 10.244.0.4:32804 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000050785s
	[INFO] 10.244.2.2:48459 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.00007773s
	
	
	==> describe nodes <==
	Name:               ha-949000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:29:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:33:48 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:30:07 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-949000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 e8535f0b09e14aea8b2456a9d977fc80
	  System UUID:                98ca49d1-0000-0000-9e6c-321a4533d56e
	  Boot ID:                    4896b77b-e0f4-43c0-af0e-3998b4352bec
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-5kkbw              0 (0%)        0 (0%)      0 (0%)           0 (0%)         89s
	  kube-system                 coredns-6f6b679f8f-kjszm             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     4m
	  kube-system                 coredns-6f6b679f8f-snq8s             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     4m
	  kube-system                 etcd-ha-949000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         4m5s
	  kube-system                 kindnet-jzj42                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      4m1s
	  kube-system                 kube-apiserver-ha-949000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         4m6s
	  kube-system                 kube-controller-manager-ha-949000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         4m5s
	  kube-system                 kube-proxy-q7ndn                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m1s
	  kube-system                 kube-scheduler-ha-949000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         4m7s
	  kube-system                 kube-vip-ha-949000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m7s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m1s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age    From             Message
	  ----    ------                   ----   ----             -------
	  Normal  Starting                 3m59s  kube-proxy       
	  Normal  Starting                 4m5s   kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  4m5s   kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  4m5s   kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    4m5s   kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     4m5s   kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           4m1s   node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeReady                3m42s  kubelet          Node ha-949000 status is now: NodeReady
	  Normal  RegisteredNode           3m1s   node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           111s   node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	
	
	Name:               ha-949000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:30:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:33:44 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:31:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-949000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 31d5d81c627e4d65bfa15e4c54f7f7c1
	  System UUID:                23e54f3d-0000-0000-86b7-b25c818528d1
	  Boot ID:                    021c5fd3-b441-490e-ac27-d927c00459f2
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-6r9s5                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         89s
	  kube-system                 etcd-ha-949000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         3m6s
	  kube-system                 kindnet-brtj6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      3m8s
	  kube-system                 kube-apiserver-ha-949000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         3m6s
	  kube-system                 kube-controller-manager-ha-949000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         3m3s
	  kube-system                 kube-proxy-4r2bt                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m8s
	  kube-system                 kube-scheduler-ha-949000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         3m2s
	  kube-system                 kube-vip-ha-949000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m4s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 3m3s                 kube-proxy       
	  Normal  NodeHasSufficientMemory  3m8s (x8 over 3m8s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    3m8s (x8 over 3m8s)  kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     3m8s (x7 over 3m8s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  3m8s                 kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           3m6s                 node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal  RegisteredNode           3m1s                 node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal  RegisteredNode           111s                 node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	
	
	Name:               ha-949000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_31_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:31:50 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:33:42 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:32:13 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-949000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 0aea5b50957a40edad0152e71b7f3a2a
	  System UUID:                3fde4d5b-0000-0000-8412-6ae6e5c787bb
	  Boot ID:                    2d4c31ca-c268-4eb4-ad45-716d78aaaa5c
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-vjf9x                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         89s
	  kube-system                 etcd-ha-949000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         116s
	  kube-system                 kindnet-9j85v                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      119s
	  kube-system                 kube-apiserver-ha-949000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         116s
	  kube-system                 kube-controller-manager-ha-949000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         118s
	  kube-system                 kube-proxy-d45q5                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         119s
	  kube-system                 kube-scheduler-ha-949000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         118s
	  kube-system                 kube-vip-ha-949000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         115s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 114s                 kube-proxy       
	  Normal  NodeHasSufficientMemory  119s (x8 over 119s)  kubelet          Node ha-949000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    119s (x8 over 119s)  kubelet          Node ha-949000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     119s (x7 over 119s)  kubelet          Node ha-949000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  119s                 kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           116s                 node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal  RegisteredNode           116s                 node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal  RegisteredNode           111s                 node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	
	
	==> dmesg <==
	[  +2.774485] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.237441] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000003] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.596627] systemd-fstab-generator[494]: Ignoring "noauto" option for root device
	[  +0.090743] systemd-fstab-generator[506]: Ignoring "noauto" option for root device
	[  +1.756564] systemd-fstab-generator[845]: Ignoring "noauto" option for root device
	[  +0.273405] systemd-fstab-generator[883]: Ignoring "noauto" option for root device
	[  +0.102089] systemd-fstab-generator[895]: Ignoring "noauto" option for root device
	[  +0.058959] kauditd_printk_skb: 115 callbacks suppressed
	[  +0.059797] systemd-fstab-generator[909]: Ignoring "noauto" option for root device
	[  +2.526421] systemd-fstab-generator[1125]: Ignoring "noauto" option for root device
	[  +0.100331] systemd-fstab-generator[1137]: Ignoring "noauto" option for root device
	[  +0.099114] systemd-fstab-generator[1149]: Ignoring "noauto" option for root device
	[  +0.141519] systemd-fstab-generator[1164]: Ignoring "noauto" option for root device
	[  +3.497423] systemd-fstab-generator[1265]: Ignoring "noauto" option for root device
	[  +0.066902] kauditd_printk_skb: 158 callbacks suppressed
	[  +2.572406] systemd-fstab-generator[1521]: Ignoring "noauto" option for root device
	[  +3.569896] systemd-fstab-generator[1651]: Ignoring "noauto" option for root device
	[  +0.054418] kauditd_printk_skb: 70 callbacks suppressed
	[  +7.004094] systemd-fstab-generator[2150]: Ignoring "noauto" option for root device
	[  +0.086539] kauditd_printk_skb: 72 callbacks suppressed
	[  +5.400345] kauditd_printk_skb: 12 callbacks suppressed
	[  +5.311598] kauditd_printk_skb: 29 callbacks suppressed
	[Aug31 22:30] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [c734c23a5308] <==
	{"level":"info","ts":"2024-08-31T22:30:42.586467Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:30:43.071231Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(3559962241544385584 13314548521573537860)"}
	{"level":"info","ts":"2024-08-31T22:30:43.071481Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-08-31T22:30:43.071678Z","caller":"etcdserver/server.go:1996","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:31:50.552948Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(3559962241544385584 13314548521573537860) learners=(485493211181035330)"}
	{"level":"info","ts":"2024-08-31T22:31:50.553563Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844","added-peer-id":"6bcd180d94f2f42","added-peer-peer-urls":["https://192.169.0.7:2380"]}
	{"level":"info","ts":"2024-08-31T22:31:50.553811Z","caller":"rafthttp/peer.go:133","msg":"starting remote peer","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.553888Z","caller":"rafthttp/pipeline.go:72","msg":"started HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.563089Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.563597Z","caller":"rafthttp/peer.go:137","msg":"started remote peer","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.563782Z","caller":"rafthttp/transport.go:317","msg":"added remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-urls":["https://192.169.0.7:2380"]}
	{"level":"info","ts":"2024-08-31T22:31:50.563934Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.564027Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:50.564274Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"warn","ts":"2024-08-31T22:31:51.592382Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"6bcd180d94f2f42","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-08-31T22:31:51.796182Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:51.801097Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:51.801930Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:51.814490Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"6bcd180d94f2f42","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-08-31T22:31:51.814527Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:51.822457Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"6bcd180d94f2f42","stream-type":"stream Message"}
	{"level":"info","ts":"2024-08-31T22:31:51.822549Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:31:52.588081Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(485493211181035330 3559962241544385584 13314548521573537860)"}
	{"level":"info","ts":"2024-08-31T22:31:52.588433Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-08-31T22:31:52.588653Z","caller":"etcdserver/server.go:1996","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"6bcd180d94f2f42"}
	
	
	==> kernel <==
	 22:33:49 up 4 min,  0 users,  load average: 0.46, 0.23, 0.11
	Linux ha-949000 5.10.207 #1 SMP Wed Aug 28 20:54:17 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [6d156ce62611] <==
	I0831 22:33:05.622835       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:15.619953       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:15.620161       1 main.go:299] handling current node
	I0831 22:33:15.620244       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:15.620403       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:15.620694       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:15.620783       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:25.614304       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:25.614589       1 main.go:299] handling current node
	I0831 22:33:25.614804       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:25.615060       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:25.615515       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:25.615641       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:35.620070       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:35.620108       1 main.go:299] handling current node
	I0831 22:33:35.620119       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:35.620124       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:35.620269       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:35.620297       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:45.620982       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:45.621246       1 main.go:299] handling current node
	I0831 22:33:45.621372       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:45.621475       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:45.621703       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:45.621934       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [ffec6106be6c] <==
	I0831 22:29:42.351464       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0831 22:29:42.447047       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0831 22:29:42.450860       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	I0831 22:29:42.451599       1 controller.go:615] quota admission added evaluator for: endpoints
	I0831 22:29:42.454145       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0831 22:29:43.117776       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0831 22:29:44.628868       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0831 22:29:44.643482       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0831 22:29:44.649286       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0831 22:29:48.568363       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	I0831 22:29:48.768446       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	E0831 22:32:24.583976       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51190: use of closed network connection
	E0831 22:32:24.787019       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51192: use of closed network connection
	E0831 22:32:24.994355       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51194: use of closed network connection
	E0831 22:32:25.183977       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51196: use of closed network connection
	E0831 22:32:25.381277       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51198: use of closed network connection
	E0831 22:32:25.569952       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51200: use of closed network connection
	E0831 22:32:25.763008       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51202: use of closed network connection
	E0831 22:32:25.965367       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51204: use of closed network connection
	E0831 22:32:26.154701       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51206: use of closed network connection
	E0831 22:32:26.694309       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51211: use of closed network connection
	E0831 22:32:26.880399       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51213: use of closed network connection
	E0831 22:32:27.077320       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51215: use of closed network connection
	E0831 22:32:27.267610       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51217: use of closed network connection
	E0831 22:32:27.476005       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51219: use of closed network connection
	
	
	==> kube-controller-manager [6670fd34164c] <==
	I0831 22:31:58.309145       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:31:58.363553       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:00.655864       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:13.090917       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:13.100697       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:13.164123       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:20.074086       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="91.437594ms"
	I0831 22:32:20.089117       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="14.696904ms"
	I0831 22:32:20.155832       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="66.417676ms"
	I0831 22:32:20.247938       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="91.617712ms"
	E0831 22:32:20.248480       1 replica_set.go:560] "Unhandled Error" err="sync \"default/busybox-7dff88458\" failed with Operation cannot be fulfilled on replicasets.apps \"busybox-7dff88458\": the object has been modified; please apply your changes to the latest version and try again" logger="UnhandledError"
	I0831 22:32:20.257744       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="7.890782ms"
	I0831 22:32:20.258053       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="29.491µs"
	I0831 22:32:20.352807       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="29.639µs"
	I0831 22:32:21.164054       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:21.310383       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="34.795µs"
	I0831 22:32:22.115926       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="5.066721ms"
	I0831 22:32:22.116004       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="26.449µs"
	I0831 22:32:23.502335       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="6.289855ms"
	I0831 22:32:23.502432       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="58.061µs"
	I0831 22:32:24.043757       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="4.626106ms"
	I0831 22:32:24.044703       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="46.785µs"
	I0831 22:32:44.005602       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m02"
	I0831 22:32:48.178405       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000"
	I0831 22:32:52.115444       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	
	
	==> kube-proxy [54d5f8041c89] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:29:49.977338       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:29:49.983071       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:29:49.983430       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:29:50.023032       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:29:50.023054       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:29:50.023070       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:29:50.025790       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:29:50.026014       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:29:50.026061       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:29:50.026844       1 config.go:197] "Starting service config controller"
	I0831 22:29:50.027602       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:29:50.027141       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:29:50.027698       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:29:50.027260       1 config.go:326] "Starting node config controller"
	I0831 22:29:50.027720       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:29:50.128122       1 shared_informer.go:320] Caches are synced for node config
	I0831 22:29:50.128144       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:29:50.128162       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-scheduler [02c10e4f765d] <==
	W0831 22:29:42.107023       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0831 22:29:42.107231       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.111966       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0831 22:29:42.112045       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.116498       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0831 22:29:42.116539       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.129701       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0831 22:29:42.129741       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError"
	I0831 22:29:45.342252       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0831 22:31:50.464567       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.464652       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod 9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2(kube-system/kube-proxy-d45q5) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-d45q5"
	E0831 22:31:50.464667       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" pod="kube-system/kube-proxy-d45q5"
	I0831 22:31:50.464683       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.476710       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:31:50.476756       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod c551bb18-9a7d-4fca-9724-be7900980a40(kube-system/kindnet-l4zbh) wasn't assumed so cannot be forgotten" pod="kube-system/kindnet-l4zbh"
	E0831 22:31:50.476767       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" pod="kube-system/kindnet-l4zbh"
	I0831 22:31:50.476781       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:32:20.049491       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-6r9s5" node="ha-949000-m02"
	E0831 22:32:20.049618       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" pod="default/busybox-7dff88458-6r9s5"
	E0831 22:32:20.071235       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-vjf9x" node="ha-949000-m03"
	E0831 22:32:20.071466       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" pod="default/busybox-7dff88458-vjf9x"
	E0831 22:32:20.073498       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	E0831 22:32:20.073571       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod e97e21d8-a69e-451c-babd-6232e12aafe0(default/busybox-7dff88458-5kkbw) wasn't assumed so cannot be forgotten" pod="default/busybox-7dff88458-5kkbw"
	E0831 22:32:20.077323       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" pod="default/busybox-7dff88458-5kkbw"
	I0831 22:32:20.077394       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	
	
	==> kubelet <==
	Aug 31 22:30:08 ha-949000 kubelet[2157]: I0831 22:30:08.742452    2157 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-snq8s" podStartSLOduration=19.742440453 podStartE2EDuration="19.742440453s" podCreationTimestamp="2024-08-31 22:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-31 22:30:08.742201936 +0000 UTC m=+24.362226027" watchObservedRunningTime="2024-08-31 22:30:08.742440453 +0000 UTC m=+24.362464538"
	Aug 31 22:30:08 ha-949000 kubelet[2157]: I0831 22:30:08.742651    2157 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/storage-provisioner" podStartSLOduration=20.742642621999998 podStartE2EDuration="20.742642622s" podCreationTimestamp="2024-08-31 22:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-31 22:30:08.732189424 +0000 UTC m=+24.352213514" watchObservedRunningTime="2024-08-31 22:30:08.742642622 +0000 UTC m=+24.362666707"
	Aug 31 22:30:44 ha-949000 kubelet[2157]: E0831 22:30:44.495173    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:30:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:30:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:30:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:30:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:31:44 ha-949000 kubelet[2157]: E0831 22:31:44.490275    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:31:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:32:20 ha-949000 kubelet[2157]: W0831 22:32:20.081132    2157 reflector.go:561] object-"default"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ha-949000" cannot list resource "configmaps" in API group "" in the namespace "default": no relationship found between node 'ha-949000' and this object
	Aug 31 22:32:20 ha-949000 kubelet[2157]: E0831 22:32:20.081252    2157 reflector.go:158] "Unhandled Error" err="object-\"default\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ha-949000\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"default\": no relationship found between node 'ha-949000' and this object" logger="UnhandledError"
	Aug 31 22:32:20 ha-949000 kubelet[2157]: I0831 22:32:20.223174    2157 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l95k\" (UniqueName: \"kubernetes.io/projected/e97e21d8-a69e-451c-babd-6232e12aafe0-kube-api-access-6l95k\") pod \"busybox-7dff88458-5kkbw\" (UID: \"e97e21d8-a69e-451c-babd-6232e12aafe0\") " pod="default/busybox-7dff88458-5kkbw"
	Aug 31 22:32:44 ha-949000 kubelet[2157]: E0831 22:32:44.489812    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:32:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:33:44 ha-949000 kubelet[2157]: E0831 22:33:44.492393    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:33:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

-- /stdout --
helpers_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-949000 -n ha-949000
helpers_test.go:262: (dbg) Run:  kubectl --context ha-949000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:286: <<< TestMultiControlPlane/serial/CopyFile FAILED: end of post-mortem logs <<<
helpers_test.go:287: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/CopyFile (3.42s)

TestMultiControlPlane/serial/StopSecondaryNode (11.66s)

=== RUN   TestMultiControlPlane/serial/StopSecondaryNode
ha_test.go:363: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 node stop m02 -v=7 --alsologtostderr
ha_test.go:363: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 node stop m02 -v=7 --alsologtostderr: (8.343249218s)
ha_test.go:369: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:369: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 7 (351.573607ms)

-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-949000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0831 15:33:59.206259    3487 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:33:59.206539    3487 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:33:59.206544    3487 out.go:358] Setting ErrFile to fd 2...
	I0831 15:33:59.206548    3487 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:33:59.206718    3487 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:33:59.206897    3487 out.go:352] Setting JSON to false
	I0831 15:33:59.206920    3487 mustload.go:65] Loading cluster: ha-949000
	I0831 15:33:59.206961    3487 notify.go:220] Checking for updates...
	I0831 15:33:59.207250    3487 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:33:59.207265    3487 status.go:255] checking status of ha-949000 ...
	I0831 15:33:59.207610    3487 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:59.207655    3487 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:59.216494    3487 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51361
	I0831 15:33:59.216923    3487 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:59.217330    3487 main.go:141] libmachine: Using API Version  1
	I0831 15:33:59.217340    3487 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:59.217548    3487 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:59.217649    3487 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:33:59.217736    3487 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:33:59.217801    3487 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:33:59.218747    3487 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:33:59.218769    3487 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:33:59.219011    3487 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:59.219030    3487 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:59.227380    3487 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51363
	I0831 15:33:59.227712    3487 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:59.228043    3487 main.go:141] libmachine: Using API Version  1
	I0831 15:33:59.228057    3487 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:59.228257    3487 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:59.228374    3487 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:33:59.228459    3487 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:33:59.228698    3487 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:59.228718    3487 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:59.237064    3487 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51365
	I0831 15:33:59.237372    3487 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:59.237692    3487 main.go:141] libmachine: Using API Version  1
	I0831 15:33:59.237700    3487 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:59.237891    3487 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:59.237998    3487 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:33:59.238122    3487 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:33:59.238145    3487 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:33:59.238230    3487 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:33:59.238326    3487 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:33:59.238424    3487 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:33:59.238505    3487 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:33:59.277313    3487 ssh_runner.go:195] Run: systemctl --version
	I0831 15:33:59.281979    3487 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:33:59.293267    3487 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:33:59.293291    3487 api_server.go:166] Checking apiserver status ...
	I0831 15:33:59.293331    3487 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:33:59.304706    3487 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:33:59.312018    3487 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:33:59.312061    3487 ssh_runner.go:195] Run: ls
	I0831 15:33:59.315460    3487 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:33:59.318664    3487 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:33:59.318675    3487 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:33:59.318700    3487 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:33:59.318713    3487 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:33:59.318973    3487 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:59.319001    3487 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:59.327521    3487 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51369
	I0831 15:33:59.327854    3487 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:59.328169    3487 main.go:141] libmachine: Using API Version  1
	I0831 15:33:59.328180    3487 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:59.328387    3487 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:59.328498    3487 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:33:59.328584    3487 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:33:59.328652    3487 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:33:59.329580    3487 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 2899 missing from process table
	I0831 15:33:59.329603    3487 status.go:330] ha-949000-m02 host status = "Stopped" (err=<nil>)
	I0831 15:33:59.329611    3487 status.go:343] host is not running, skipping remaining checks
	I0831 15:33:59.329617    3487 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:33:59.329627    3487 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:33:59.329859    3487 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:59.329881    3487 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:59.338608    3487 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51371
	I0831 15:33:59.338966    3487 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:59.339283    3487 main.go:141] libmachine: Using API Version  1
	I0831 15:33:59.339302    3487 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:59.339516    3487 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:59.339618    3487 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:33:59.339699    3487 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:33:59.339772    3487 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:33:59.340737    3487 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:33:59.340748    3487 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:33:59.340995    3487 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:59.341025    3487 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:59.349718    3487 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51373
	I0831 15:33:59.350078    3487 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:59.350420    3487 main.go:141] libmachine: Using API Version  1
	I0831 15:33:59.350434    3487 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:59.350621    3487 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:59.350733    3487 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:33:59.350812    3487 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:33:59.351083    3487 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:59.351105    3487 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:59.359488    3487 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51375
	I0831 15:33:59.359823    3487 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:59.360167    3487 main.go:141] libmachine: Using API Version  1
	I0831 15:33:59.360181    3487 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:59.360380    3487 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:59.360494    3487 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:33:59.360616    3487 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:33:59.360626    3487 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:33:59.360710    3487 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:33:59.360798    3487 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:33:59.360881    3487 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:33:59.360958    3487 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:33:59.388609    3487 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:33:59.399705    3487 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:33:59.399720    3487 api_server.go:166] Checking apiserver status ...
	I0831 15:33:59.399755    3487 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:33:59.410487    3487 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:33:59.417419    3487 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:33:59.417459    3487 ssh_runner.go:195] Run: ls
	I0831 15:33:59.420886    3487 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:33:59.424106    3487 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:33:59.424118    3487 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:33:59.424128    3487 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:33:59.424139    3487 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:33:59.424386    3487 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:59.424407    3487 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:59.432921    3487 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51379
	I0831 15:33:59.433266    3487 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:59.433594    3487 main.go:141] libmachine: Using API Version  1
	I0831 15:33:59.433603    3487 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:59.433823    3487 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:59.433934    3487 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:33:59.434020    3487 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:33:59.434087    3487 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:33:59.435026    3487 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:33:59.435036    3487 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:33:59.435277    3487 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:59.435296    3487 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:59.443631    3487 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51381
	I0831 15:33:59.443948    3487 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:59.444303    3487 main.go:141] libmachine: Using API Version  1
	I0831 15:33:59.444320    3487 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:59.444532    3487 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:59.444643    3487 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:33:59.444727    3487 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:33:59.444968    3487 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:33:59.444996    3487 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:33:59.453477    3487 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51383
	I0831 15:33:59.453790    3487 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:33:59.454132    3487 main.go:141] libmachine: Using API Version  1
	I0831 15:33:59.454148    3487 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:33:59.454377    3487 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:33:59.454481    3487 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:33:59.454611    3487 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:33:59.454624    3487 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:33:59.454735    3487 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:33:59.454800    3487 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:33:59.454878    3487 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:33:59.454954    3487 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:33:59.490565    3487 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:33:59.501856    3487 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:381: status says not three kubelets are running: args "out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr": ha-949000
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured

ha-949000-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

ha-949000-m03
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured

ha-949000-m04
type: Worker
host: Running
kubelet: Stopped

helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-949000 -n ha-949000
helpers_test.go:245: <<< TestMultiControlPlane/serial/StopSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:246: ======>  post-mortem[TestMultiControlPlane/serial/StopSecondaryNode]: minikube logs <======
helpers_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 logs -n 25
helpers_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 logs -n 25: (2.302590469s)
helpers_test.go:253: TestMultiControlPlane/serial/StopSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|----------------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	|    Command     |                 Args                 |      Profile      |  User   | Version |     Start Time      |      End Time       |
	|----------------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| update-context | functional-593000                    | functional-593000 | jenkins | v1.33.1 | 31 Aug 24 15:28 PDT | 31 Aug 24 15:28 PDT |
	|                | update-context                       |                   |         |         |                     |                     |
	|                | --alsologtostderr -v=2               |                   |         |         |                     |                     |
	| delete         | -p functional-593000                 | functional-593000 | jenkins | v1.33.1 | 31 Aug 24 15:29 PDT | 31 Aug 24 15:29 PDT |
	| start          | -p ha-949000 --wait=true             | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:29 PDT | 31 Aug 24 15:32 PDT |
	|                | --memory=2200 --ha                   |                   |         |         |                     |                     |
	|                | -v=7 --alsologtostderr               |                   |         |         |                     |                     |
	|                | --driver=hyperkit                    |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- apply -f             | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | ./testdata/ha/ha-pod-dns-test.yaml   |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- rollout status       | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | deployment/busybox                   |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x --           |                   |         |         |                     |                     |
	|                | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw -- nslookup  |                   |         |         |                     |                     |
	|                | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 -- nslookup  |                   |         |         |                     |                     |
	|                | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x -- nslookup  |                   |         |         |                     |                     |
	|                | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw              |                   |         |         |                     |                     |
	|                | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|                | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|                | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-5kkbw -- sh        |                   |         |         |                     |                     |
	|                | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5              |                   |         |         |                     |                     |
	|                | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|                | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|                | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-6r9s5 -- sh        |                   |         |         |                     |                     |
	|                | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x              |                   |         |         |                     |                     |
	|                | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|                | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|                | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl        | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|                | busybox-7dff88458-vjf9x -- sh        |                   |         |         |                     |                     |
	|                | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| node           | add -p ha-949000 -v=7                | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT |                     |
	|                | --alsologtostderr                    |                   |         |         |                     |                     |
	| node           | ha-949000 node stop m02 -v=7         | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:33 PDT | 31 Aug 24 15:33 PDT |
	|                | --alsologtostderr                    |                   |         |         |                     |                     |
	|----------------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:29:09
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:29:09.276641    2876 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:29:09.276909    2876 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:29:09.276915    2876 out.go:358] Setting ErrFile to fd 2...
	I0831 15:29:09.276919    2876 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:29:09.277077    2876 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:29:09.278657    2876 out.go:352] Setting JSON to false
	I0831 15:29:09.304076    2876 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1720,"bootTime":1725141629,"procs":442,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:29:09.304206    2876 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:29:09.363205    2876 out.go:177] * [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:29:09.404287    2876 notify.go:220] Checking for updates...
	I0831 15:29:09.428120    2876 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:29:09.489040    2876 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:09.566857    2876 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:29:09.611464    2876 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:29:09.632356    2876 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:09.653358    2876 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:29:09.674652    2876 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:29:09.704277    2876 out.go:177] * Using the hyperkit driver based on user configuration
	I0831 15:29:09.746520    2876 start.go:297] selected driver: hyperkit
	I0831 15:29:09.746549    2876 start.go:901] validating driver "hyperkit" against <nil>
	I0831 15:29:09.746572    2876 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:29:09.750947    2876 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:29:09.751059    2876 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:29:09.759462    2876 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:29:09.763334    2876 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:09.763355    2876 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:29:09.763386    2876 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 15:29:09.763603    2876 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:29:09.763661    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:09.763670    2876 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0831 15:29:09.763676    2876 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0831 15:29:09.763757    2876 start.go:340] cluster config:
	{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docke
r CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0
GPUs: AutoPauseInterval:1m0s}
	I0831 15:29:09.763847    2876 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:29:09.806188    2876 out.go:177] * Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	I0831 15:29:09.827330    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:09.827400    2876 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:29:09.827429    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:29:09.827640    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:29:09.827663    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:29:09.828200    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:09.828242    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json: {Name:mka3af2c42dba1cbf0f487cd55ddf735793024ce Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:09.828849    2876 start.go:360] acquireMachinesLock for ha-949000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:29:09.828952    2876 start.go:364] duration metric: took 84.577µs to acquireMachinesLock for "ha-949000"
	I0831 15:29:09.828988    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType
:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:09.829059    2876 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 15:29:09.903354    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:29:09.903628    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:09.903698    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:09.913643    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51029
	I0831 15:29:09.913991    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:09.914387    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:09.914395    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:09.914636    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:09.914768    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:09.914873    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:09.915000    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:29:09.915023    2876 client.go:168] LocalClient.Create starting
	I0831 15:29:09.915061    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:29:09.915112    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:09.915129    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:09.915188    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:29:09.915229    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:09.915249    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:09.915265    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:29:09.915270    2876 main.go:141] libmachine: (ha-949000) Calling .PreCreateCheck
	I0831 15:29:09.915359    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:09.915528    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:09.915949    2876 main.go:141] libmachine: Creating machine...
	I0831 15:29:09.915958    2876 main.go:141] libmachine: (ha-949000) Calling .Create
	I0831 15:29:09.916028    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:09.916144    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:09.916024    2884 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:09.916224    2876 main.go:141] libmachine: (ha-949000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:29:10.099863    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.099790    2884 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa...
	I0831 15:29:10.256390    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.256317    2884 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk...
	I0831 15:29:10.256437    2876 main.go:141] libmachine: (ha-949000) DBG | Writing magic tar header
	I0831 15:29:10.256445    2876 main.go:141] libmachine: (ha-949000) DBG | Writing SSH key tar header
	I0831 15:29:10.257253    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.257126    2884 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000 ...
	I0831 15:29:10.614937    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:10.614967    2876 main.go:141] libmachine: (ha-949000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid
	I0831 15:29:10.615070    2876 main.go:141] libmachine: (ha-949000) DBG | Using UUID 98cab9ba-901d-49d1-9e6c-321a4533d56e
	I0831 15:29:10.724629    2876 main.go:141] libmachine: (ha-949000) DBG | Generated MAC ce:8:77:f7:42:5e
	I0831 15:29:10.724653    2876 main.go:141] libmachine: (ha-949000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:29:10.724744    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ae630)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:10.724785    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ae630)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:10.724823    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "98cab9ba-901d-49d1-9e6c-321a4533d56e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial l
oglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:29:10.724851    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 98cab9ba-901d-49d1-9e6c-321a4533d56e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset noresto
re waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:29:10.724862    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:29:10.727687    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Pid is 2887
	I0831 15:29:10.728136    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 0
	I0831 15:29:10.728145    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:10.728201    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:10.729180    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:10.729276    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:10.729293    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:10.729309    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:10.729317    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:10.735289    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:29:10.788351    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:29:10.788955    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:10.788972    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:10.788980    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:10.788989    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:11.164652    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:29:11.164668    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:29:11.279214    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:11.279233    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:11.279245    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:11.279263    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:11.280165    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:29:11.280176    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:29:12.729552    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 1
	I0831 15:29:12.729568    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:12.729694    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:12.730495    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:12.730552    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:12.730566    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:12.730580    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:12.730595    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:14.731472    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 2
	I0831 15:29:14.731486    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:14.731548    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:14.732412    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:14.732458    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:14.732473    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:14.732492    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:14.732506    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:16.732786    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 3
	I0831 15:29:16.732802    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:16.732855    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:16.733685    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:16.733713    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:16.733721    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:16.733748    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:16.733759    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:16.839902    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:29:16.839946    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:29:16.839959    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:29:16.864989    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:29:18.735154    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 4
	I0831 15:29:18.735170    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:18.735286    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:18.736038    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:18.736084    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:18.736094    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:18.736103    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:18.736112    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:20.736683    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 5
	I0831 15:29:20.736698    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:20.736791    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:20.737588    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:20.737620    2876 main.go:141] libmachine: (ha-949000) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:20.737633    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:20.737640    2876 main.go:141] libmachine: (ha-949000) DBG | Found match: ce:8:77:f7:42:5e
	I0831 15:29:20.737645    2876 main.go:141] libmachine: (ha-949000) DBG | IP: 192.169.0.5
	I0831 15:29:20.737694    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:20.738300    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:20.738400    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:20.738493    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:29:20.738503    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:20.738582    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:20.738639    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:20.739400    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:29:20.739409    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:29:20.739415    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:29:20.739420    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:20.739500    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:20.739608    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:20.739694    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:20.739784    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:20.739906    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:20.740082    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:20.740088    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:29:21.810169    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:29:21.810183    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:29:21.810190    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.810319    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.810409    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.810520    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.810622    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.810753    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.810899    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.810907    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:29:21.876064    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:29:21.876103    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:29:21.876110    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:29:21.876116    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:21.876252    2876 buildroot.go:166] provisioning hostname "ha-949000"
	I0831 15:29:21.876263    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:21.876353    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.876438    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.876542    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.876625    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.876705    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.876835    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.876977    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.876986    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000 && echo "ha-949000" | sudo tee /etc/hostname
	I0831 15:29:21.955731    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000
	
	I0831 15:29:21.955752    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.955889    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.955998    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.956098    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.956196    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.956332    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.956482    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.956494    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:29:22.031652    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:29:22.031674    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:29:22.031695    2876 buildroot.go:174] setting up certificates
	I0831 15:29:22.031704    2876 provision.go:84] configureAuth start
	I0831 15:29:22.031711    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:22.031840    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:22.031922    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.032006    2876 provision.go:143] copyHostCerts
	I0831 15:29:22.032046    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:29:22.032109    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:29:22.032118    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:29:22.032257    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:29:22.032465    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:29:22.032502    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:29:22.032507    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:29:22.032592    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:29:22.032752    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:29:22.032790    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:29:22.032795    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:29:22.032874    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:29:22.033015    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000 san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]
	I0831 15:29:22.113278    2876 provision.go:177] copyRemoteCerts
	I0831 15:29:22.113334    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:29:22.113349    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.113477    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.113572    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.113653    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.113746    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:22.153055    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:29:22.153132    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:29:22.173186    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:29:22.173254    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0831 15:29:22.192526    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:29:22.192581    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:29:22.212150    2876 provision.go:87] duration metric: took 180.428736ms to configureAuth
	I0831 15:29:22.212163    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:29:22.212301    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:22.212314    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:22.212441    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.212522    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.212600    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.212680    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.212760    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.212882    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.213008    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.213015    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:29:22.281023    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:29:22.281035    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:29:22.281108    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:29:22.281121    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.281265    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.281355    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.281474    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.281559    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.281695    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.281836    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.281881    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:29:22.358523    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:29:22.358550    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.358687    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.358785    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.358873    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.358967    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.359137    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.359281    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.359293    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:29:23.900860    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:29:23.900883    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:29:23.900890    2876 main.go:141] libmachine: (ha-949000) Calling .GetURL
	I0831 15:29:23.901027    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:29:23.901035    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:29:23.901040    2876 client.go:171] duration metric: took 13.985813631s to LocalClient.Create
	I0831 15:29:23.901051    2876 start.go:167] duration metric: took 13.985855387s to libmachine.API.Create "ha-949000"
	I0831 15:29:23.901061    2876 start.go:293] postStartSetup for "ha-949000" (driver="hyperkit")
	I0831 15:29:23.901070    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:29:23.901080    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:23.901239    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:29:23.901251    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:23.901337    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:23.901438    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.901525    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:23.901622    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:23.947237    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:29:23.951946    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:29:23.951965    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:29:23.952069    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:29:23.952248    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:29:23.952255    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:29:23.952462    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:29:23.961814    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:29:23.990864    2876 start.go:296] duration metric: took 89.791408ms for postStartSetup
	I0831 15:29:23.990895    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:23.991499    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:23.991642    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:23.991961    2876 start.go:128] duration metric: took 14.162686523s to createHost
	I0831 15:29:23.991974    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:23.992084    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:23.992175    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.992259    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.992348    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:23.992457    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:23.992584    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:23.992591    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:29:24.059500    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143363.867477750
	
	I0831 15:29:24.059512    2876 fix.go:216] guest clock: 1725143363.867477750
	I0831 15:29:24.059517    2876 fix.go:229] Guest: 2024-08-31 15:29:23.86747775 -0700 PDT Remote: 2024-08-31 15:29:23.991969 -0700 PDT m=+14.752935961 (delta=-124.49125ms)
	I0831 15:29:24.059536    2876 fix.go:200] guest clock delta is within tolerance: -124.49125ms
	I0831 15:29:24.059546    2876 start.go:83] releasing machines lock for "ha-949000", held for 14.230377343s
	I0831 15:29:24.059565    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.059706    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:24.059819    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060132    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060244    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060319    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:29:24.060346    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:24.060384    2876 ssh_runner.go:195] Run: cat /version.json
	I0831 15:29:24.060396    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:24.060439    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:24.060498    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:24.060525    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:24.060623    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:24.060654    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:24.060746    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:24.060765    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:24.060837    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:24.096035    2876 ssh_runner.go:195] Run: systemctl --version
	I0831 15:29:24.148302    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:29:24.153275    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:29:24.153315    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:29:24.165840    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:29:24.165854    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:29:24.165972    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:29:24.181258    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:29:24.191149    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:29:24.200150    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:29:24.200197    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:29:24.209198    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:29:24.217930    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:29:24.227002    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:29:24.237048    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:29:24.246383    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:29:24.255322    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:29:24.264369    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:29:24.273487    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:29:24.282138    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:29:24.290220    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:24.385700    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:29:24.407032    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:29:24.407111    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:29:24.421439    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:29:24.437414    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:29:24.451401    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:29:24.463382    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:29:24.474406    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:29:24.507277    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:29:24.517707    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:29:24.532548    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:29:24.535464    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:29:24.542699    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:29:24.557395    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:29:24.662440    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:29:24.769422    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:29:24.769500    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:29:24.784888    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:24.881202    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:29:27.276172    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.394917578s)
	I0831 15:29:27.276233    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:29:27.287739    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:29:27.301676    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:29:27.312754    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:29:27.407771    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:29:27.503429    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:27.614933    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:29:27.628621    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:29:27.641141    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:27.759998    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:29:27.816359    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:29:27.816437    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:29:27.820881    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:29:27.820929    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:29:27.824109    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:29:27.852863    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:29:27.852937    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:29:27.870865    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:29:27.937728    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:29:27.937791    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:27.938219    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:29:27.943196    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:29:27.954353    2876 kubeadm.go:883] updating cluster {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:29:27.954419    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:27.954480    2876 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:29:27.967028    2876 docker.go:685] Got preloaded images: 
	I0831 15:29:27.967040    2876 docker.go:691] registry.k8s.io/kube-apiserver:v1.31.0 wasn't preloaded
	I0831 15:29:27.967094    2876 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0831 15:29:27.975409    2876 ssh_runner.go:195] Run: which lz4
	I0831 15:29:27.978323    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0831 15:29:27.978434    2876 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0831 15:29:27.981530    2876 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0831 15:29:27.981546    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (342554258 bytes)
	I0831 15:29:28.829399    2876 docker.go:649] duration metric: took 850.988233ms to copy over tarball
	I0831 15:29:28.829466    2876 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0831 15:29:31.094292    2876 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.264775779s)
	I0831 15:29:31.094306    2876 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0831 15:29:31.120523    2876 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0831 15:29:31.129444    2876 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2631 bytes)
	I0831 15:29:31.144462    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:31.255144    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:29:33.625508    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.370311255s)
	I0831 15:29:33.625595    2876 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:29:33.642024    2876 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0831 15:29:33.642043    2876 cache_images.go:84] Images are preloaded, skipping loading
	I0831 15:29:33.642059    2876 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.31.0 docker true true} ...
	I0831 15:29:33.642140    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:29:33.642205    2876 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:29:33.687213    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:33.687227    2876 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0831 15:29:33.687238    2876 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:29:33.687253    2876 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-949000 NodeName:ha-949000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:29:33.687355    2876 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-949000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 15:29:33.687380    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:29:33.687436    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:29:33.701609    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:29:33.701679    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:29:33.701731    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:29:33.709907    2876 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:29:33.709972    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0831 15:29:33.717287    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0831 15:29:33.730443    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:29:33.743765    2876 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0831 15:29:33.758082    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1446 bytes)
	I0831 15:29:33.771561    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:29:33.774412    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:29:33.783869    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:33.875944    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:29:33.891425    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.5
	I0831 15:29:33.891438    2876 certs.go:194] generating shared ca certs ...
	I0831 15:29:33.891448    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:33.891633    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:29:33.891710    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:29:33.891723    2876 certs.go:256] generating profile certs ...
	I0831 15:29:33.891775    2876 certs.go:363] generating signed profile cert for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:29:33.891786    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt with IP's: []
	I0831 15:29:34.044423    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt ...
	I0831 15:29:34.044439    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt: {Name:mkff87193f625d157d1a4f89b0da256c90604083 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.044784    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key ...
	I0831 15:29:34.044793    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key: {Name:mke1833d9b208b07a8ff6dd57d320eb167de83a3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.045031    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93
	I0831 15:29:34.045046    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.254]
	I0831 15:29:34.207099    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 ...
	I0831 15:29:34.207118    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93: {Name:mk38f2742462440beada92d4e254471d0fe85db9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.207433    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93 ...
	I0831 15:29:34.207443    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93: {Name:mk29a130e2c97d3f060f247819d7c01c723a8502 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.207661    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:29:34.207842    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:29:34.208036    2876 certs.go:363] generating signed profile cert for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:29:34.208050    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt with IP's: []
	I0831 15:29:34.314095    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt ...
	I0831 15:29:34.314111    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt: {Name:mk708e4939e774d52c9a7d3335e0202d13493538 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.314481    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key ...
	I0831 15:29:34.314489    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key: {Name:mkcfbb0611781f7e5640984b0a9cc91976dc5482 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.314700    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:29:34.314732    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:29:34.314751    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:29:34.314769    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:29:34.314787    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:29:34.314811    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:29:34.314831    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:29:34.314850    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:29:34.314947    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:29:34.314997    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:29:34.315005    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:29:34.315034    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:29:34.315062    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:29:34.315091    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:29:34.315155    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:29:34.315187    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.315211    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.315229    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.315668    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:29:34.335288    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:29:34.355233    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:29:34.374357    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:29:34.393538    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0831 15:29:34.413840    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:29:34.433106    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:29:34.452816    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:29:34.472204    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:29:34.492102    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:29:34.512126    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:29:34.530945    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:29:34.546877    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:29:34.551681    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:29:34.565047    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.568688    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.568737    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.573250    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:29:34.587250    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:29:34.595871    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.599208    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.599248    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.603521    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:29:34.611689    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:29:34.620193    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.624378    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.624428    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.628785    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:29:34.637154    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:29:34.640263    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:29:34.640305    2876 kubeadm.go:392] StartCluster: {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:29:34.640393    2876 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:29:34.652254    2876 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:29:34.660013    2876 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0831 15:29:34.668312    2876 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0831 15:29:34.675860    2876 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0831 15:29:34.675868    2876 kubeadm.go:157] found existing configuration files:
	
	I0831 15:29:34.675907    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0831 15:29:34.683169    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0831 15:29:34.683212    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0831 15:29:34.690543    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0831 15:29:34.697493    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0831 15:29:34.697539    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0831 15:29:34.704850    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0831 15:29:34.712593    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0831 15:29:34.712643    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0831 15:29:34.720047    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0831 15:29:34.727239    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0831 15:29:34.727279    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0831 15:29:34.734575    2876 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0831 15:29:34.806234    2876 kubeadm.go:310] [init] Using Kubernetes version: v1.31.0
	I0831 15:29:34.806318    2876 kubeadm.go:310] [preflight] Running pre-flight checks
	I0831 15:29:34.880330    2876 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0831 15:29:34.880424    2876 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0831 15:29:34.880492    2876 kubeadm.go:310] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I0831 15:29:34.888288    2876 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0831 15:29:34.931799    2876 out.go:235]   - Generating certificates and keys ...
	I0831 15:29:34.931855    2876 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0831 15:29:34.931917    2876 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0831 15:29:35.094247    2876 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0831 15:29:35.242021    2876 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0831 15:29:35.553368    2876 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0831 15:29:35.874778    2876 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0831 15:29:36.045823    2876 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0831 15:29:36.046072    2876 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-949000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0831 15:29:36.253528    2876 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0831 15:29:36.253651    2876 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-949000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0831 15:29:36.362185    2876 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0831 15:29:36.481613    2876 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0831 15:29:36.595099    2876 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0831 15:29:36.595231    2876 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0831 15:29:36.687364    2876 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0831 15:29:36.786350    2876 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0831 15:29:36.838505    2876 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0831 15:29:37.183406    2876 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0831 15:29:37.330529    2876 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0831 15:29:37.331123    2876 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0831 15:29:37.332869    2876 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0831 15:29:37.354639    2876 out.go:235]   - Booting up control plane ...
	I0831 15:29:37.354715    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0831 15:29:37.354798    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0831 15:29:37.354856    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0831 15:29:37.354940    2876 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0831 15:29:37.355015    2876 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0831 15:29:37.355046    2876 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0831 15:29:37.462381    2876 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0831 15:29:37.462478    2876 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I0831 15:29:37.972217    2876 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 510.286911ms
	I0831 15:29:37.972306    2876 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0831 15:29:43.988604    2876 kubeadm.go:310] [api-check] The API server is healthy after 6.020603512s
	I0831 15:29:44.000520    2876 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0831 15:29:44.008573    2876 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0831 15:29:44.022134    2876 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0831 15:29:44.022318    2876 kubeadm.go:310] [mark-control-plane] Marking the node ha-949000 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0831 15:29:44.029102    2876 kubeadm.go:310] [bootstrap-token] Using token: zw6kb9.o9r4potygin4i7x2
	I0831 15:29:44.050780    2876 out.go:235]   - Configuring RBAC rules ...
	I0831 15:29:44.050942    2876 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0831 15:29:44.094287    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0831 15:29:44.099052    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0831 15:29:44.101377    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0831 15:29:44.103328    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0831 15:29:44.105426    2876 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0831 15:29:44.395210    2876 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0831 15:29:44.821705    2876 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0831 15:29:45.395130    2876 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0831 15:29:45.396108    2876 kubeadm.go:310] 
	I0831 15:29:45.396158    2876 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0831 15:29:45.396163    2876 kubeadm.go:310] 
	I0831 15:29:45.396236    2876 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0831 15:29:45.396245    2876 kubeadm.go:310] 
	I0831 15:29:45.396264    2876 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0831 15:29:45.396314    2876 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0831 15:29:45.396355    2876 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0831 15:29:45.396359    2876 kubeadm.go:310] 
	I0831 15:29:45.396397    2876 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0831 15:29:45.396406    2876 kubeadm.go:310] 
	I0831 15:29:45.396453    2876 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0831 15:29:45.396458    2876 kubeadm.go:310] 
	I0831 15:29:45.396496    2876 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0831 15:29:45.396560    2876 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0831 15:29:45.396617    2876 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0831 15:29:45.396623    2876 kubeadm.go:310] 
	I0831 15:29:45.396691    2876 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0831 15:29:45.396760    2876 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0831 15:29:45.396766    2876 kubeadm.go:310] 
	I0831 15:29:45.396839    2876 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token zw6kb9.o9r4potygin4i7x2 \
	I0831 15:29:45.396919    2876 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 \
	I0831 15:29:45.396939    2876 kubeadm.go:310] 	--control-plane 
	I0831 15:29:45.396943    2876 kubeadm.go:310] 
	I0831 15:29:45.397018    2876 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0831 15:29:45.397029    2876 kubeadm.go:310] 
	I0831 15:29:45.397093    2876 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token zw6kb9.o9r4potygin4i7x2 \
	I0831 15:29:45.397173    2876 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 
	I0831 15:29:45.397526    2876 kubeadm.go:310] W0831 22:29:34.618825    1608 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 15:29:45.397751    2876 kubeadm.go:310] W0831 22:29:34.619993    1608 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 15:29:45.397847    2876 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0831 15:29:45.397857    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:45.397874    2876 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0831 15:29:45.420531    2876 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0831 15:29:45.477445    2876 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0831 15:29:45.482633    2876 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.31.0/kubectl ...
	I0831 15:29:45.482643    2876 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2601 bytes)
	I0831 15:29:45.498168    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0831 15:29:45.749965    2876 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0831 15:29:45.750050    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000 minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=true
	I0831 15:29:45.750061    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:45.882304    2876 ops.go:34] apiserver oom_adj: -16
	I0831 15:29:45.896818    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:46.398021    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:46.897815    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:47.397274    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:47.897049    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:48.397593    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:48.462357    2876 kubeadm.go:1113] duration metric: took 2.712335704s to wait for elevateKubeSystemPrivileges
	I0831 15:29:48.462374    2876 kubeadm.go:394] duration metric: took 13.821875392s to StartCluster
	I0831 15:29:48.462389    2876 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:48.462482    2876 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:48.462909    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:48.463157    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0831 15:29:48.463168    2876 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:48.463181    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:29:48.463194    2876 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 15:29:48.463223    2876 addons.go:69] Setting storage-provisioner=true in profile "ha-949000"
	I0831 15:29:48.463228    2876 addons.go:69] Setting default-storageclass=true in profile "ha-949000"
	I0831 15:29:48.463245    2876 addons.go:234] Setting addon storage-provisioner=true in "ha-949000"
	I0831 15:29:48.463250    2876 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-949000"
	I0831 15:29:48.463260    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:29:48.463303    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:48.463512    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.463518    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.463528    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.463540    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.472681    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51052
	I0831 15:29:48.473013    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51054
	I0831 15:29:48.473095    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.473332    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.473451    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.473463    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.473652    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.473665    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.473689    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.473921    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.474101    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.474113    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.474145    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.474214    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.474299    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.476440    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:48.476667    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 15:29:48.477025    2876 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 15:29:48.477197    2876 addons.go:234] Setting addon default-storageclass=true in "ha-949000"
	I0831 15:29:48.477218    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:29:48.477428    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.477442    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.483175    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51056
	I0831 15:29:48.483519    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.483886    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.483904    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.484146    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.484254    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.484334    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.484406    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.485343    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:48.485904    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51058
	I0831 15:29:48.486187    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.486486    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.486495    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.486696    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.487040    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.487078    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.495680    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51060
	I0831 15:29:48.496017    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.496360    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.496389    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.496611    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.496715    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.496791    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.496872    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.497794    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:48.497926    2876 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0831 15:29:48.497934    2876 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0831 15:29:48.497944    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:48.498021    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:48.498099    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:48.498200    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:48.498277    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:48.507200    2876 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0831 15:29:48.527696    2876 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0831 15:29:48.527708    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0831 15:29:48.527725    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:48.527878    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:48.527981    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:48.528082    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:48.528217    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:48.528370    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.169.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0831 15:29:48.564053    2876 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0831 15:29:48.586435    2876 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0831 15:29:48.827708    2876 start.go:971] {"host.minikube.internal": 192.169.0.1} host record injected into CoreDNS's ConfigMap
	I0831 15:29:48.827730    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.827739    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.827907    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.827916    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.827922    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.827926    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.828046    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.828049    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:48.828058    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.828113    2876 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 15:29:48.828125    2876 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 15:29:48.828210    2876 round_trippers.go:463] GET https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0831 15:29:48.828215    2876 round_trippers.go:469] Request Headers:
	I0831 15:29:48.828223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:29:48.828227    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:29:48.833724    2876 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:29:48.834156    2876 round_trippers.go:463] PUT https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0831 15:29:48.834163    2876 round_trippers.go:469] Request Headers:
	I0831 15:29:48.834169    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:29:48.834199    2876 round_trippers.go:473]     Content-Type: application/json
	I0831 15:29:48.834205    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:29:48.835718    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:29:48.835861    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.835876    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.836028    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.836037    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.836048    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.019783    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:49.019796    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:49.019979    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:49.019989    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:49.019994    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:49.019999    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:49.019999    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.020151    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:49.020153    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.020159    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:49.059498    2876 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
	I0831 15:29:49.117324    2876 addons.go:510] duration metric: took 654.121351ms for enable addons: enabled=[default-storageclass storage-provisioner]
	I0831 15:29:49.117374    2876 start.go:246] waiting for cluster config update ...
	I0831 15:29:49.117390    2876 start.go:255] writing updated cluster config ...
	I0831 15:29:49.155430    2876 out.go:201] 
	I0831 15:29:49.192527    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:49.192625    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:49.214378    2876 out.go:177] * Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	I0831 15:29:49.272137    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:49.272171    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:29:49.272338    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:29:49.272356    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:29:49.272445    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:49.273113    2876 start.go:360] acquireMachinesLock for ha-949000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:29:49.273204    2876 start.go:364] duration metric: took 68.322µs to acquireMachinesLock for "ha-949000-m02"
	I0831 15:29:49.273234    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:49.273329    2876 start.go:125] createHost starting for "m02" (driver="hyperkit")
	I0831 15:29:49.296266    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:29:49.296429    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:49.296488    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:49.306391    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51065
	I0831 15:29:49.306732    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:49.307039    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:49.307051    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:49.307254    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:49.307374    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:29:49.307457    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:29:49.307559    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:29:49.307576    2876 client.go:168] LocalClient.Create starting
	I0831 15:29:49.307604    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:29:49.307643    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:49.307655    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:49.307696    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:29:49.307726    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:49.307735    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:49.307749    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:29:49.307754    2876 main.go:141] libmachine: (ha-949000-m02) Calling .PreCreateCheck
	I0831 15:29:49.307836    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.307906    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:29:49.333695    2876 main.go:141] libmachine: Creating machine...
	I0831 15:29:49.333716    2876 main.go:141] libmachine: (ha-949000-m02) Calling .Create
	I0831 15:29:49.333916    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.334092    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.333909    2898 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:49.334195    2876 main.go:141] libmachine: (ha-949000-m02) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:29:49.534537    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.534440    2898 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa...
	I0831 15:29:49.629999    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.629917    2898 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk...
	I0831 15:29:49.630021    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Writing magic tar header
	I0831 15:29:49.630031    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Writing SSH key tar header
	I0831 15:29:49.630578    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.630526    2898 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02 ...
	I0831 15:29:49.986563    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.986593    2876 main.go:141] libmachine: (ha-949000-m02) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid
	I0831 15:29:49.986663    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Using UUID 23e5d675-5201-4f3d-86b7-b25c818528d1
	I0831 15:29:50.021467    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Generated MAC 92:7:3c:3f:ee:b7
	I0831 15:29:50.021484    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:29:50.021548    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:50.021582    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:50.021623    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "23e5d675-5201-4f3d-86b7-b25c818528d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:29:50.021665    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 23e5d675-5201-4f3d-86b7-b25c818528d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:29:50.021684    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:29:50.024624    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Pid is 2899
	I0831 15:29:50.025044    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 0
	I0831 15:29:50.025058    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:50.025119    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:50.026207    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:50.026276    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:50.026305    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:50.026350    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:50.026373    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:50.026416    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:50.032754    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:29:50.041001    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:29:50.041896    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:50.041918    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:50.041929    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:50.041946    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:50.432260    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:29:50.432276    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:29:50.547071    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:50.547090    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:50.547112    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:50.547127    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:50.547965    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:29:50.547973    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:29:52.027270    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 1
	I0831 15:29:52.027288    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:52.027415    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:52.028177    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:52.028225    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:52.028236    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:52.028247    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:52.028254    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:52.028263    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:54.029110    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 2
	I0831 15:29:54.029126    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:54.029231    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:54.029999    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:54.030057    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:54.030075    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:54.030087    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:54.030095    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:54.030103    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:56.031274    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 3
	I0831 15:29:56.031292    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:56.031369    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:56.032155    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:56.032168    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:56.032178    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:56.032196    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:56.032213    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:56.032224    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:56.132338    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:29:56.132386    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:29:56.132396    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:29:56.155372    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:29:58.032308    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 4
	I0831 15:29:58.032325    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:58.032424    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:58.033214    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:58.033247    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:58.033259    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:58.033269    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:58.033278    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:58.033287    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:30:00.033449    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 5
	I0831 15:30:00.033465    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:00.033544    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:30:00.034313    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:30:00.034404    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:30:00.034418    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:30:00.034426    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found match: 92:7:3c:3f:ee:b7
	I0831 15:30:00.034433    2876 main.go:141] libmachine: (ha-949000-m02) DBG | IP: 192.169.0.6
	I0831 15:30:00.034475    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:30:00.035147    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:00.035249    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:00.035348    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:30:00.035357    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:30:00.035434    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:00.035493    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:30:00.036274    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:30:00.036284    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:30:00.036289    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:30:00.036293    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:00.036398    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:00.036485    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:00.036575    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:00.036655    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:00.036771    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:00.036969    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:00.036976    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:30:01.059248    2876 main.go:141] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I0831 15:30:04.124333    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:30:04.124345    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:30:04.124351    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.124488    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.124590    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.124683    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.124778    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.124921    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.125101    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.125110    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:30:04.190272    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:30:04.190323    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:30:04.190329    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:30:04.190334    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.190465    2876 buildroot.go:166] provisioning hostname "ha-949000-m02"
	I0831 15:30:04.190476    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.190558    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.190652    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.190763    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.190844    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.190943    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.191068    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.191204    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.191213    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m02 && echo "ha-949000-m02" | sudo tee /etc/hostname
	I0831 15:30:04.267934    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m02
	
	I0831 15:30:04.267948    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.268081    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.268202    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.268299    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.268391    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.268525    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.268665    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.268684    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:30:04.340314    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:30:04.340330    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:30:04.340340    2876 buildroot.go:174] setting up certificates
	I0831 15:30:04.340346    2876 provision.go:84] configureAuth start
	I0831 15:30:04.340353    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.340483    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:04.340577    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.340665    2876 provision.go:143] copyHostCerts
	I0831 15:30:04.340691    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:30:04.340751    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:30:04.340757    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:30:04.340904    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:30:04.341121    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:30:04.341161    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:30:04.341166    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:30:04.341243    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:30:04.341390    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:30:04.341427    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:30:04.341432    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:30:04.341508    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:30:04.341670    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m02 san=[127.0.0.1 192.169.0.6 ha-949000-m02 localhost minikube]
	I0831 15:30:04.509456    2876 provision.go:177] copyRemoteCerts
	I0831 15:30:04.509508    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:30:04.509523    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.509674    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.509762    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.509874    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.509973    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:04.550810    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:30:04.550883    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:30:04.571982    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:30:04.572058    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:30:04.592601    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:30:04.592680    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:30:04.612516    2876 provision.go:87] duration metric: took 272.157929ms to configureAuth
	I0831 15:30:04.612531    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:30:04.612691    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:04.612706    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:04.612851    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.612970    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.613064    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.613150    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.613227    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.613345    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.613483    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.613491    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:30:04.678333    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:30:04.678345    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:30:04.678436    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:30:04.678450    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.678582    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.678669    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.678767    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.678846    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.678978    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.679124    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.679167    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:30:04.756204    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:30:04.756224    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.756411    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.756527    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.756630    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.756734    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.756851    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.757006    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.757027    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:30:06.370825    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:30:06.370840    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:30:06.370855    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetURL
	I0831 15:30:06.370996    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:30:06.371003    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:30:06.371008    2876 client.go:171] duration metric: took 17.063185858s to LocalClient.Create
	I0831 15:30:06.371017    2876 start.go:167] duration metric: took 17.063218984s to libmachine.API.Create "ha-949000"
	I0831 15:30:06.371023    2876 start.go:293] postStartSetup for "ha-949000-m02" (driver="hyperkit")
	I0831 15:30:06.371029    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:30:06.371039    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.371176    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:30:06.371190    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.371279    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.371365    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.371448    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.371522    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:06.410272    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:30:06.413456    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:30:06.413467    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:30:06.413573    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:30:06.413753    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:30:06.413762    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:30:06.413962    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:30:06.421045    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:30:06.440540    2876 start.go:296] duration metric: took 69.508758ms for postStartSetup
	I0831 15:30:06.440562    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:30:06.441179    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:06.441343    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:30:06.441726    2876 start.go:128] duration metric: took 17.168146238s to createHost
	I0831 15:30:06.441741    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.441826    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.441909    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.442008    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.442102    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.442220    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:06.442339    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:06.442346    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:30:06.507669    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143406.563138986
	
	I0831 15:30:06.507682    2876 fix.go:216] guest clock: 1725143406.563138986
	I0831 15:30:06.507687    2876 fix.go:229] Guest: 2024-08-31 15:30:06.563138986 -0700 PDT Remote: 2024-08-31 15:30:06.441735 -0700 PDT m=+57.202103081 (delta=121.403986ms)
	I0831 15:30:06.507698    2876 fix.go:200] guest clock delta is within tolerance: 121.403986ms
	I0831 15:30:06.507701    2876 start.go:83] releasing machines lock for "ha-949000-m02", held for 17.234244881s
	I0831 15:30:06.507719    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.507845    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:06.534518    2876 out.go:177] * Found network options:
	I0831 15:30:06.585154    2876 out.go:177]   - NO_PROXY=192.169.0.5
	W0831 15:30:06.608372    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:30:06.608434    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609377    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609624    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609725    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:30:06.609763    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	W0831 15:30:06.609837    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:30:06.609978    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.609993    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:30:06.610018    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.610265    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.610300    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.610460    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.610487    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.610621    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:06.610643    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.610806    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	W0831 15:30:06.649012    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:30:06.649075    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:30:06.693849    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:30:06.693863    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:30:06.693938    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:30:06.709316    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:30:06.718380    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:30:06.727543    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:30:06.727609    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:30:06.736698    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:30:06.745615    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:30:06.755140    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:30:06.764398    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:30:06.773464    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:30:06.782661    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:30:06.791918    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:30:06.801132    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:30:06.809259    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:30:06.817528    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:06.918051    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:30:06.937658    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:30:06.937726    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:30:06.952225    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:30:06.964364    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:30:06.981641    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:30:06.992676    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:30:07.003746    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:30:07.061399    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:30:07.071765    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:30:07.086915    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:30:07.089960    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:30:07.097339    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:30:07.110902    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:30:07.218878    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:30:07.327438    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:30:07.327478    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:30:07.343077    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:07.455166    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:30:09.753051    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.297833346s)
	I0831 15:30:09.753112    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:30:09.763410    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:30:09.776197    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:30:09.788015    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:30:09.886287    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:30:09.979666    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:10.091986    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:30:10.105474    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:30:10.116526    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:10.223654    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:30:10.284365    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:30:10.284447    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:30:10.288841    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:30:10.288894    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:30:10.292674    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:30:10.327492    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:30:10.327571    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:30:10.348428    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:30:10.394804    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:30:10.438643    2876 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:30:10.460438    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:10.460677    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:30:10.463911    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:30:10.474227    2876 mustload.go:65] Loading cluster: ha-949000
	I0831 15:30:10.474382    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:10.474620    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:10.474636    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:10.483465    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51091
	I0831 15:30:10.483852    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:10.484170    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:10.484182    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:10.484380    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:10.484504    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:30:10.484591    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:10.484661    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:30:10.485631    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:30:10.485888    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:10.485912    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:10.494468    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51093
	I0831 15:30:10.494924    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:10.495238    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:10.495250    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:10.495476    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:10.495585    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:30:10.495693    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.6
	I0831 15:30:10.495700    2876 certs.go:194] generating shared ca certs ...
	I0831 15:30:10.495711    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.495883    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:30:10.495953    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:30:10.495961    2876 certs.go:256] generating profile certs ...
	I0831 15:30:10.496069    2876 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:30:10.496092    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952
	I0831 15:30:10.496104    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0831 15:30:10.585710    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 ...
	I0831 15:30:10.585732    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952: {Name:mkfd98043f041b827744dcc9a0bc27d9f7ba3a8d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.586080    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952 ...
	I0831 15:30:10.586093    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952: {Name:mk6025bd0561394827636d384e273ec532f21510 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.586307    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:30:10.586527    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:30:10.586791    2876 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:30:10.586800    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:30:10.586823    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:30:10.586842    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:30:10.586860    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:30:10.586879    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:30:10.586902    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:30:10.586921    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:30:10.586939    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:30:10.587027    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:30:10.587073    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:30:10.587082    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:30:10.587115    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:30:10.587145    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:30:10.587174    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:30:10.587237    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:30:10.587271    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:10.587293    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:30:10.587312    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:30:10.587343    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:30:10.587493    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:30:10.587598    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:30:10.587689    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:30:10.587790    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:30:10.619319    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:30:10.622586    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:30:10.631798    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:30:10.634863    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:30:10.644806    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:30:10.648392    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:30:10.657224    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:30:10.660506    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:30:10.668998    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:30:10.672282    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:30:10.681734    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:30:10.685037    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:30:10.697579    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:30:10.717100    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:30:10.736755    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:30:10.757074    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:30:10.776635    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0831 15:30:10.796052    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:30:10.815309    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:30:10.834549    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:30:10.854663    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:30:10.873734    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:30:10.892872    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:30:10.912223    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:30:10.925669    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:30:10.939310    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:30:10.952723    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:30:10.966203    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:30:10.980670    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:30:10.994195    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:30:11.007818    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:30:11.012076    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:30:11.021306    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.024674    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.024710    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.028962    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:30:11.038172    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:30:11.048226    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.051704    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.051746    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.056026    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:30:11.065281    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:30:11.074586    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.077977    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.078018    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.082263    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:30:11.091560    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:30:11.094606    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:30:11.094641    2876 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.31.0 docker true true} ...
	I0831 15:30:11.094696    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:30:11.094712    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:30:11.094743    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:30:11.107306    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:30:11.107348    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:30:11.107400    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:30:11.116476    2876 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:30:11.116538    2876 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:30:11.125199    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet
	I0831 15:30:11.125199    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl
	I0831 15:30:11.125202    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm
	I0831 15:30:13.495982    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:30:13.496079    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:30:13.499639    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:30:13.499660    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:30:14.245316    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:30:14.245403    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:30:14.249019    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:30:14.249045    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:30:14.305452    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:30:14.335903    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:30:14.336035    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:30:14.348689    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:30:14.348746    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
	I0831 15:30:14.608960    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:30:14.617331    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:30:14.630716    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:30:14.643952    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:30:14.657665    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:30:14.660616    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:30:14.670825    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:14.766762    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:30:14.782036    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:30:14.782341    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:14.782363    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:14.791218    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51120
	I0831 15:30:14.791554    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:14.791943    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:14.791962    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:14.792169    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:14.792281    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:30:14.792379    2876 start.go:317] joinCluster: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 Clu
sterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpira
tion:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:30:14.792482    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0831 15:30:14.792500    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:30:14.792589    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:30:14.792677    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:30:14.792804    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:30:14.792889    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:30:14.904364    2876 start.go:343] trying to join control-plane node "m02" to cluster: &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:30:14.904404    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token sa5gl8.nk4lqkhvqrn6uouk --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443"
	I0831 15:30:43.067719    2876 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token sa5gl8.nk4lqkhvqrn6uouk --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443": (28.162893612s)
	I0831 15:30:43.067762    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0831 15:30:43.495593    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000-m02 minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=false
	I0831 15:30:43.584878    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-949000-m02 node-role.kubernetes.io/control-plane:NoSchedule-
	I0831 15:30:43.672222    2876 start.go:319] duration metric: took 28.879433845s to joinCluster
	I0831 15:30:43.672264    2876 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:30:43.672464    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:43.696001    2876 out.go:177] * Verifying Kubernetes components...
	I0831 15:30:43.753664    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:43.969793    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:30:43.995704    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:30:43.995955    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, U
serAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:30:43.995999    2876 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:30:43.996168    2876 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:30:43.996224    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:43.996229    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:43.996246    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:43.996253    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:44.008886    2876 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0831 15:30:44.496443    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:44.496458    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:44.496465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:44.496468    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:44.499732    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:44.996970    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:44.996984    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:44.996990    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:44.996993    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.000189    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:45.496917    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:45.496930    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:45.496936    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:45.496939    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.498866    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:30:45.996558    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:45.996579    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:45.996604    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:45.996626    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.999357    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:45.999667    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:46.496895    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:46.496907    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:46.496914    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:46.496917    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:46.499220    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:46.996382    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:46.996397    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:46.996403    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:46.996406    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:46.998788    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:47.497035    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:47.497048    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:47.497055    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:47.497059    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:47.499487    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:47.996662    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:47.996675    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:47.996695    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:47.996699    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:47.998935    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:48.496588    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:48.496603    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:48.496610    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:48.496613    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:48.498806    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:48.499160    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:48.996774    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:48.996800    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:48.996806    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:48.996810    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:48.998862    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:49.496728    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:49.496741    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:49.496748    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:49.496753    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:49.500270    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:49.996536    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:49.996548    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:49.996555    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:49.996560    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:49.998977    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:50.496423    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:50.496441    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:50.496452    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:50.496458    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:50.499488    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:50.499941    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:50.996502    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:50.996515    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:50.996520    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:50.996525    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:50.998339    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:30:51.496978    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:51.496999    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:51.497011    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:51.497018    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:51.499859    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:51.997186    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:51.997200    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:51.997207    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:51.997210    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.000228    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:52.498065    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:52.498084    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:52.498093    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:52.498097    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.500425    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:52.500868    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:52.996733    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:52.996786    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:52.996804    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:52.996819    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.999878    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:53.496732    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:53.496752    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:53.496764    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:53.496772    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:53.499723    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:53.996635    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:53.996698    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:53.996722    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:53.996730    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.000327    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:54.496855    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:54.496875    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:54.496883    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:54.496888    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.499247    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:54.996676    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:54.996692    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:54.996701    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.996706    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:54.999066    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:54.999477    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:55.496949    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:55.496960    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:55.496967    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:55.496971    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:55.499074    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:55.996611    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:55.996627    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:55.996644    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:55.996651    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:55.999061    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:56.497363    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:56.497376    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:56.497383    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:56.497386    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:56.499540    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:56.997791    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:56.997810    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:56.997822    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:56.997828    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.001116    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:57.001481    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:57.497843    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:57.497862    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:57.497874    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:57.497881    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.500770    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:57.998298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:57.998324    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:57.998335    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.998344    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.002037    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:58.496643    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:58.496664    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:58.496677    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:58.496683    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.499466    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:58.997398    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:58.997468    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:58.997484    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.997490    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.000768    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:59.498644    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:59.498668    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:59.498680    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:59.498685    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.502573    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:59.503046    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:59.996689    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:59.996715    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:59.996765    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:59.996773    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.999409    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:00.496654    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:00.496668    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.496677    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.496681    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.498585    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.499019    2876 node_ready.go:49] node "ha-949000-m02" has status "Ready":"True"
	I0831 15:31:00.499031    2876 node_ready.go:38] duration metric: took 16.50261118s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:31:00.499038    2876 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:31:00.499081    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:00.499087    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.499092    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.499095    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.502205    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.506845    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.506892    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:31:00.506897    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.506903    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.506908    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.508659    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.509078    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.509085    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.509091    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.509094    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.510447    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.510831    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.510839    2876 pod_ready.go:82] duration metric: took 3.983743ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.510852    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.510887    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:31:00.510892    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.510897    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.510901    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.512274    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.512740    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.512747    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.512752    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.512757    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.514085    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.514446    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.514457    2876 pod_ready.go:82] duration metric: took 3.596287ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.514464    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.514501    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:31:00.514506    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.514512    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.514515    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.517897    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.518307    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.518314    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.518320    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.518324    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.519756    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.520128    2876 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.520138    2876 pod_ready.go:82] duration metric: took 5.668748ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.520144    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.520177    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:31:00.520182    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.520187    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.520191    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.521454    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.521852    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:00.521860    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.521865    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.521870    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.523054    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.523372    2876 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.523381    2876 pod_ready.go:82] duration metric: took 3.231682ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.523393    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.698293    2876 request.go:632] Waited for 174.813181ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:31:00.698344    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:31:00.698420    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.698432    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.698439    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.701539    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.897673    2876 request.go:632] Waited for 195.424003ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.897783    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.897794    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.897805    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.897814    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.900981    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.901407    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.901419    2876 pod_ready.go:82] duration metric: took 378.015429ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.901429    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.097805    2876 request.go:632] Waited for 196.320526ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:31:01.097926    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:31:01.097936    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.097947    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.097955    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.100563    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.298122    2876 request.go:632] Waited for 197.162644ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:01.298157    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:01.298162    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.298168    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.298172    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.300402    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.300781    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:01.300791    2876 pod_ready.go:82] duration metric: took 399.34942ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.300807    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.497316    2876 request.go:632] Waited for 196.39746ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:31:01.497376    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:31:01.497387    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.497397    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.497405    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.500651    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:01.698231    2876 request.go:632] Waited for 196.759957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:01.698322    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:01.698333    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.698344    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.698353    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.701256    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.701766    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:01.701775    2876 pod_ready.go:82] duration metric: took 400.954779ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.701785    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.898783    2876 request.go:632] Waited for 196.946643ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:31:01.898903    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:31:01.898917    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.898929    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.898938    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.902347    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.097749    2876 request.go:632] Waited for 194.738931ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.097815    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.097824    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.097834    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.097843    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.101525    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.102016    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.102028    2876 pod_ready.go:82] duration metric: took 400.230387ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.102037    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.296929    2876 request.go:632] Waited for 194.771963ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:31:02.296979    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:31:02.296996    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.297010    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.297016    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.300518    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.498356    2876 request.go:632] Waited for 197.140595ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.498409    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.498414    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.498421    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.498425    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.500151    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:02.500554    2876 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.500564    2876 pod_ready.go:82] duration metric: took 398.515508ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.500577    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.697756    2876 request.go:632] Waited for 197.121926ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:31:02.697847    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:31:02.697859    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.697871    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.697879    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.701227    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.896975    2876 request.go:632] Waited for 195.16614ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:02.897029    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:02.897044    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.897050    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.897054    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.899135    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:02.899494    2876 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.899504    2876 pod_ready.go:82] duration metric: took 398.915896ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.899511    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.098441    2876 request.go:632] Waited for 198.871316ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:31:03.098576    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:31:03.098587    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.098599    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.098606    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.101995    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.297740    2876 request.go:632] Waited for 194.927579ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:03.297801    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:03.297842    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.297855    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.297863    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.300956    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.301560    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:03.301572    2876 pod_ready.go:82] duration metric: took 402.049602ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.301580    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.498380    2876 request.go:632] Waited for 196.707011ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:31:03.498472    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:31:03.498482    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.498494    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.498505    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.502174    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.696864    2876 request.go:632] Waited for 194.200989ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:03.696916    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:03.696926    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.696938    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.696944    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.700327    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.700769    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:03.700782    2876 pod_ready.go:82] duration metric: took 399.189338ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.700791    2876 pod_ready.go:39] duration metric: took 3.201699285s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:31:03.700816    2876 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:31:03.700877    2876 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:31:03.712528    2876 api_server.go:72] duration metric: took 20.039964419s to wait for apiserver process to appear ...
	I0831 15:31:03.712539    2876 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:31:03.712554    2876 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:31:03.715722    2876 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:31:03.715760    2876 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:31:03.715765    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.715771    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.715775    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.716371    2876 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:31:03.716424    2876 api_server.go:141] control plane version: v1.31.0
	I0831 15:31:03.716433    2876 api_server.go:131] duration metric: took 3.890107ms to wait for apiserver health ...
	I0831 15:31:03.716440    2876 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:31:03.898331    2876 request.go:632] Waited for 181.827666ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:03.898385    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:03.898446    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.898465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.898473    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.903436    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:31:03.906746    2876 system_pods.go:59] 17 kube-system pods found
	I0831 15:31:03.906767    2876 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:31:03.906771    2876 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:31:03.906775    2876 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:31:03.906778    2876 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:31:03.906783    2876 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:31:03.906786    2876 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:31:03.906789    2876 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:31:03.906793    2876 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:31:03.906796    2876 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:31:03.906799    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:31:03.906802    2876 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:31:03.906805    2876 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:31:03.906810    2876 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:31:03.906814    2876 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:31:03.906816    2876 system_pods.go:61] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:31:03.906819    2876 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:31:03.906824    2876 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:31:03.906830    2876 system_pods.go:74] duration metric: took 190.381994ms to wait for pod list to return data ...
	I0831 15:31:03.906835    2876 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:31:04.096833    2876 request.go:632] Waited for 189.933385ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:31:04.096919    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:31:04.096929    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.096940    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.096947    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.100750    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:04.100942    2876 default_sa.go:45] found service account: "default"
	I0831 15:31:04.100955    2876 default_sa.go:55] duration metric: took 194.103228ms for default service account to be created ...
	I0831 15:31:04.100963    2876 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:31:04.297283    2876 request.go:632] Waited for 196.269925ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:04.297349    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:04.297359    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.297370    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.297380    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.301594    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:31:04.305403    2876 system_pods.go:86] 17 kube-system pods found
	I0831 15:31:04.305414    2876 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:31:04.305418    2876 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:31:04.305421    2876 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:31:04.305424    2876 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:31:04.305427    2876 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:31:04.305431    2876 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:31:04.305434    2876 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:31:04.305438    2876 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:31:04.305440    2876 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:31:04.305443    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:31:04.305446    2876 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:31:04.305449    2876 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:31:04.305452    2876 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:31:04.305455    2876 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:31:04.305457    2876 system_pods.go:89] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:31:04.305459    2876 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:31:04.305462    2876 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:31:04.305467    2876 system_pods.go:126] duration metric: took 204.496865ms to wait for k8s-apps to be running ...
	I0831 15:31:04.305472    2876 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:31:04.305532    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:31:04.316332    2876 system_svc.go:56] duration metric: took 10.855844ms WaitForService to wait for kubelet
	I0831 15:31:04.316347    2876 kubeadm.go:582] duration metric: took 20.643776408s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:31:04.316359    2876 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:31:04.497360    2876 request.go:632] Waited for 180.939277ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:31:04.497396    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:31:04.497400    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.497406    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.497409    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.500112    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:04.500615    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:31:04.500630    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:31:04.500640    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:31:04.500644    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:31:04.500647    2876 node_conditions.go:105] duration metric: took 184.28246ms to run NodePressure ...
	I0831 15:31:04.500655    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:31:04.500673    2876 start.go:255] writing updated cluster config ...
	I0831 15:31:04.522012    2876 out.go:201] 
	I0831 15:31:04.543188    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:04.543261    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:04.565062    2876 out.go:177] * Starting "ha-949000-m03" control-plane node in "ha-949000" cluster
	I0831 15:31:04.608029    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:31:04.608097    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:31:04.608326    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:31:04.608349    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:31:04.608480    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:04.609474    2876 start.go:360] acquireMachinesLock for ha-949000-m03: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:31:04.609608    2876 start.go:364] duration metric: took 107.158µs to acquireMachinesLock for "ha-949000-m03"
	I0831 15:31:04.609644    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m03 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:04.609770    2876 start.go:125] createHost starting for "m03" (driver="hyperkit")
	I0831 15:31:04.631012    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:31:04.631142    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:04.631178    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:04.640831    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51128
	I0831 15:31:04.641212    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:04.641538    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:04.641551    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:04.641754    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:04.641864    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:04.641951    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:04.642054    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:31:04.642071    2876 client.go:168] LocalClient.Create starting
	I0831 15:31:04.642111    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:31:04.642169    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:31:04.642179    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:31:04.642217    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:31:04.642255    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:31:04.642264    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:31:04.642276    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:31:04.642281    2876 main.go:141] libmachine: (ha-949000-m03) Calling .PreCreateCheck
	I0831 15:31:04.642379    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:04.642422    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:04.652222    2876 main.go:141] libmachine: Creating machine...
	I0831 15:31:04.652235    2876 main.go:141] libmachine: (ha-949000-m03) Calling .Create
	I0831 15:31:04.652380    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:04.652531    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:04.652372    3223 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:31:04.652595    2876 main.go:141] libmachine: (ha-949000-m03) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:31:04.967913    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:04.967796    3223 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa...
	I0831 15:31:05.218214    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:05.218148    3223 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk...
	I0831 15:31:05.218234    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Writing magic tar header
	I0831 15:31:05.218243    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Writing SSH key tar header
	I0831 15:31:05.219245    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:05.219093    3223 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03 ...
	I0831 15:31:05.777334    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:05.777394    2876 main.go:141] libmachine: (ha-949000-m03) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid
	I0831 15:31:05.777478    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Using UUID 3fdefe95-7552-4d5b-8412-6ae6e5c787bb
	I0831 15:31:05.805053    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Generated MAC fa:59:9e:3b:35:6d
	I0831 15:31:05.805071    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:31:05.805106    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00011a5d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:31:05.805131    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00011a5d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:31:05.805226    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "3fdefe95-7552-4d5b-8412-6ae6e5c787bb", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:31:05.805279    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 3fdefe95-7552-4d5b-8412-6ae6e5c787bb -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:31:05.805308    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:31:05.808244    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Pid is 3227
	I0831 15:31:05.808817    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 0
	I0831 15:31:05.808830    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:05.808902    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:05.809826    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:05.809929    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:05.809949    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:05.809975    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:05.809992    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:05.810004    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:05.810013    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:05.816053    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:31:05.824689    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:31:05.825475    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:31:05.825495    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:31:05.825508    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:31:05.825518    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:31:06.214670    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:31:06.214691    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:31:06.330054    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:31:06.330074    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:31:06.330102    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:31:06.330119    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:31:06.330929    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:31:06.330943    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:31:07.810124    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 1
	I0831 15:31:07.810138    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:07.810246    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:07.811007    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:07.811057    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:07.811067    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:07.811076    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:07.811082    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:07.811088    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:07.811097    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:09.811187    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 2
	I0831 15:31:09.811200    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:09.811312    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:09.812186    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:09.812196    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:09.812205    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:09.812213    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:09.812234    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:09.812241    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:09.812249    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:11.813365    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 3
	I0831 15:31:11.813388    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:11.813446    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:11.814261    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:11.814310    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:11.814328    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:11.814337    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:11.814342    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:11.814361    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:11.814371    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:11.957428    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:31:11.957483    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:31:11.957496    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:31:11.981309    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:31:13.815231    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 4
	I0831 15:31:13.815245    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:13.815334    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:13.816118    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:13.816176    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:13.816186    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:13.816194    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:13.816200    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:13.816208    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:13.816220    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:15.816252    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 5
	I0831 15:31:15.816273    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:15.816393    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:15.817241    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:15.817305    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0831 15:31:15.817315    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:31:15.817332    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found match: fa:59:9e:3b:35:6d
	I0831 15:31:15.817339    2876 main.go:141] libmachine: (ha-949000-m03) DBG | IP: 192.169.0.7
	I0831 15:31:15.817379    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:15.817997    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:15.818096    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:15.818188    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:31:15.818195    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:31:15.818279    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:15.818331    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:15.819115    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:31:15.819122    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:31:15.819126    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:31:15.819130    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:15.819211    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:15.819288    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:15.819367    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:15.819433    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:15.819544    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:15.819737    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:15.819744    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:31:16.864414    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:31:16.864428    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:31:16.864434    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.864597    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.864686    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.864782    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.864877    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.865009    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.865163    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.865170    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:31:16.911810    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:31:16.911850    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:31:16.911857    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:31:16.911862    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:16.911989    2876 buildroot.go:166] provisioning hostname "ha-949000-m03"
	I0831 15:31:16.911998    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:16.912088    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.912161    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.912247    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.912323    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.912399    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.912532    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.912676    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.912685    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m03 && echo "ha-949000-m03" | sudo tee /etc/hostname
	I0831 15:31:16.972401    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m03
	
	I0831 15:31:16.972418    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.972554    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.972683    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.972793    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.972889    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.973016    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.973150    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.973161    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:31:17.026608    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:31:17.026626    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:31:17.026635    2876 buildroot.go:174] setting up certificates
	I0831 15:31:17.026641    2876 provision.go:84] configureAuth start
	I0831 15:31:17.026647    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:17.026793    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:17.026903    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.026995    2876 provision.go:143] copyHostCerts
	I0831 15:31:17.027029    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:31:17.027088    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:31:17.027094    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:31:17.027236    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:31:17.027433    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:31:17.027471    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:31:17.027477    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:31:17.027559    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:31:17.027700    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:31:17.027737    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:31:17.027742    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:31:17.027813    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:31:17.027956    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m03 san=[127.0.0.1 192.169.0.7 ha-949000-m03 localhost minikube]
	I0831 15:31:17.258292    2876 provision.go:177] copyRemoteCerts
	I0831 15:31:17.258340    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:31:17.258353    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.258490    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.258583    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.258663    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.258746    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:17.289869    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:31:17.289967    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:31:17.308984    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:31:17.309048    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:31:17.328947    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:31:17.329010    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:31:17.348578    2876 provision.go:87] duration metric: took 321.944434ms to configureAuth
	I0831 15:31:17.348592    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:31:17.348776    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:17.348791    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:17.348926    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.349023    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.349112    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.349190    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.349267    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.349365    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.349505    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.349513    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:31:17.396974    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:31:17.396988    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:31:17.397075    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:31:17.397087    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.397218    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.397314    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.397402    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.397507    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.397637    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.397789    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.397838    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:31:17.455821    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:31:17.455842    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.455977    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.456072    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.456168    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.456252    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.456374    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.456520    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.456533    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:31:19.032300    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:31:19.032316    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:31:19.032323    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetURL
	I0831 15:31:19.032456    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:31:19.032464    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:31:19.032468    2876 client.go:171] duration metric: took 14.391172658s to LocalClient.Create
	I0831 15:31:19.032480    2876 start.go:167] duration metric: took 14.391215349s to libmachine.API.Create "ha-949000"
	I0831 15:31:19.032489    2876 start.go:293] postStartSetup for "ha-949000-m03" (driver="hyperkit")
	I0831 15:31:19.032496    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:31:19.032506    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.032660    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:31:19.032675    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.032767    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.032855    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.032947    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.033033    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:19.073938    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:31:19.079886    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:31:19.079901    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:31:19.080017    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:31:19.080199    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:31:19.080206    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:31:19.080413    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:31:19.092434    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:31:19.119963    2876 start.go:296] duration metric: took 87.46929ms for postStartSetup
	I0831 15:31:19.119990    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:19.120591    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:19.120767    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:19.121161    2876 start.go:128] duration metric: took 14.512164484s to createHost
	I0831 15:31:19.121177    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.121269    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.121343    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.121419    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.121507    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.121631    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:19.121747    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:19.121754    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:31:19.168319    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143479.023948613
	
	I0831 15:31:19.168331    2876 fix.go:216] guest clock: 1725143479.023948613
	I0831 15:31:19.168337    2876 fix.go:229] Guest: 2024-08-31 15:31:19.023948613 -0700 PDT Remote: 2024-08-31 15:31:19.12117 -0700 PDT m=+129.881500927 (delta=-97.221387ms)
	I0831 15:31:19.168349    2876 fix.go:200] guest clock delta is within tolerance: -97.221387ms
	I0831 15:31:19.168354    2876 start.go:83] releasing machines lock for "ha-949000-m03", held for 14.559521208s
	I0831 15:31:19.168370    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.168508    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:19.193570    2876 out.go:177] * Found network options:
	I0831 15:31:19.255565    2876 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0831 15:31:19.295062    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:31:19.295088    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:31:19.295104    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.295822    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.296008    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.296101    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:31:19.296130    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	W0831 15:31:19.296153    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:31:19.296165    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:31:19.296225    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:31:19.296229    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.296236    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.296334    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.296350    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.296442    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.296455    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.296560    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:19.296581    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.296680    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	W0831 15:31:19.323572    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:31:19.323629    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:31:19.371272    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:31:19.371294    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:31:19.371393    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:31:19.387591    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:31:19.396789    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:31:19.405160    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:31:19.405208    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:31:19.413496    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:31:19.422096    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:31:19.430386    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:31:19.438699    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:31:19.447187    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:31:19.455984    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:31:19.464947    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:31:19.474438    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:31:19.482528    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:31:19.490487    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:19.582349    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:31:19.599985    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:31:19.600056    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:31:19.612555    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:31:19.632269    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:31:19.650343    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:31:19.661102    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:31:19.671812    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:31:19.695791    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:31:19.706786    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:31:19.722246    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:31:19.725125    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:31:19.732176    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:31:19.745845    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:31:19.848832    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:31:19.960260    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:31:19.960281    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:31:19.974005    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:20.073538    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:31:22.469978    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.396488217s)
	I0831 15:31:22.470044    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:31:22.482132    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:31:22.494892    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:31:22.505113    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:31:22.597737    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:31:22.715451    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:22.823995    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:31:22.837904    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:31:22.849106    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:22.943937    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:31:23.002374    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:31:23.002452    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:31:23.006859    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:31:23.006916    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:31:23.010129    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:31:23.037227    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:31:23.037307    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:31:23.056021    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:31:23.095679    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:31:23.119303    2876 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:31:23.162269    2876 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:31:23.183203    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:23.183553    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:31:23.187788    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:31:23.197219    2876 mustload.go:65] Loading cluster: ha-949000
	I0831 15:31:23.197405    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:23.197647    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:23.197669    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:23.206705    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51151
	I0831 15:31:23.207061    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:23.207432    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:23.207448    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:23.207666    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:23.207786    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:31:23.207874    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:23.207946    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:31:23.208928    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:31:23.209186    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:23.209220    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:23.218074    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51153
	I0831 15:31:23.218433    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:23.218804    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:23.218819    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:23.219039    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:23.219165    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:31:23.219284    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.7
	I0831 15:31:23.219289    2876 certs.go:194] generating shared ca certs ...
	I0831 15:31:23.219301    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.219493    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:31:23.219569    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:31:23.219578    2876 certs.go:256] generating profile certs ...
	I0831 15:31:23.219685    2876 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:31:23.219705    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3
	I0831 15:31:23.219719    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0831 15:31:23.437317    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 ...
	I0831 15:31:23.437340    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3: {Name:mk58aa028a0f003ebc9e4d90dc317cdac139f88f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.437643    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3 ...
	I0831 15:31:23.437656    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3: {Name:mkaffb8ad3060932ca991ed93b1f8350d31a48ee Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.437859    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:31:23.438064    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:31:23.438321    2876 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:31:23.438330    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:31:23.438352    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:31:23.438370    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:31:23.438423    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:31:23.438445    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:31:23.438467    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:31:23.438484    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:31:23.438502    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:31:23.438598    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:31:23.438648    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:31:23.438657    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:31:23.438698    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:31:23.438737    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:31:23.438775    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:31:23.438861    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:31:23.438902    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.438923    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.438941    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.438970    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:31:23.439126    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:31:23.439259    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:31:23.439370    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:31:23.439494    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:31:23.472129    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:31:23.475604    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:31:23.483468    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:31:23.486771    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:31:23.494732    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:31:23.497856    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:31:23.505900    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:31:23.509221    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:31:23.517853    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:31:23.521110    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:31:23.529522    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:31:23.532921    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:31:23.540561    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:31:23.560999    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:31:23.580941    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:31:23.601890    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:31:23.621742    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1444 bytes)
	I0831 15:31:23.642294    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:31:23.662119    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:31:23.682734    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:31:23.702621    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:31:23.722704    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:31:23.743032    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:31:23.763003    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:31:23.776540    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:31:23.790112    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:31:23.803743    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:31:23.817470    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:31:23.831871    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:31:23.845310    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:31:23.858947    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:31:23.863254    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:31:23.871668    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.875114    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.875147    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.879499    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:31:23.888263    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:31:23.896800    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.900783    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.900840    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.905239    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:31:23.913677    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:31:23.921998    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.925382    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.925421    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.929547    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:31:23.938211    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:31:23.941244    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:31:23.941280    2876 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.31.0 docker true true} ...
	I0831 15:31:23.941346    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:31:23.941365    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:31:23.941403    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:31:23.953552    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:31:23.953594    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:31:23.953640    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:31:23.961797    2876 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:31:23.961850    2876 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:31:23.970244    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256
	I0831 15:31:23.970245    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256
	I0831 15:31:23.970248    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256
	I0831 15:31:23.970260    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:31:23.970262    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:31:23.970297    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:31:23.970351    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:31:23.970358    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:31:23.982898    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:31:23.982926    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:31:23.982950    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:31:23.982949    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:31:23.982968    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:31:23.983039    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:31:24.006648    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:31:24.006684    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
	I0831 15:31:24.520609    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:31:24.528302    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:31:24.542845    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:31:24.556549    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:31:24.581157    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:31:24.584179    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
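	[editor note] The /etc/hosts command above uses a grep-filter-and-append idiom to stay idempotent: strip any stale line for the hostname, append the fresh mapping, then copy the result back. A minimal standalone sketch of that pattern (operating on a temp file rather than the real /etc/hosts; the `update_host` helper and fixture values are hypothetical):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Fixture hosts file with a stale entry for the control-plane name.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.169.0.200\tcontrol-plane.minikube.internal\n' > "$hosts"

# Mirror the log's idiom: grep -v drops any existing line ending in
# "<tab><name>", then the new "<ip><tab><name>" mapping is appended.
update_host() {
  local ip=$1 name=$2 file=$3
  { grep -v $'\t'"${name}"'$' "$file" || true; printf '%s\t%s\n' "$ip" "$name"; } > "${file}.new"
  mv "${file}.new" "$file"
}

update_host 192.169.0.254 control-plane.minikube.internal "$hosts"
grep 'control-plane.minikube.internal' "$hosts"
```

Running `update_host` twice with the same arguments leaves exactly one mapping, which is why minikube can re-run this on every start without duplicating entries.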
	I0831 15:31:24.593696    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:24.689916    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:31:24.707403    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:31:24.707700    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:24.707728    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:24.717047    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51156
	I0831 15:31:24.717380    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:24.717728    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:24.717743    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:24.718003    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:24.718123    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:31:24.718213    2876 start.go:317] joinCluster: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:31:24.718336    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0831 15:31:24.718349    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:31:24.718430    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:31:24.718495    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:31:24.718573    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:31:24.718638    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:31:24.810129    2876 start.go:343] trying to join control-plane node "m03" to cluster: &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:24.810181    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token l0ka7f.9kdk1py3wyogvy9t --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443"
	I0831 15:31:52.526613    2876 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token l0ka7f.9kdk1py3wyogvy9t --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443": (27.716564604s)
	I0831 15:31:52.526639    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0831 15:31:53.011028    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000-m03 minikube.k8s.io/updated_at=2024_08_31T15_31_53_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=false
	I0831 15:31:53.087862    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-949000-m03 node-role.kubernetes.io/control-plane:NoSchedule-
	I0831 15:31:53.172826    2876 start.go:319] duration metric: took 28.454760565s to joinCluster
	I0831 15:31:53.172884    2876 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:53.173075    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:53.197446    2876 out.go:177] * Verifying Kubernetes components...
	I0831 15:31:53.254031    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:53.535623    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:31:53.558317    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:31:53.558557    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:31:53.558593    2876 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:31:53.558836    2876 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:31:53.558893    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:53.558899    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:53.558906    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:53.558909    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:53.561151    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:54.058994    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:54.059009    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:54.059015    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:54.059020    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:54.061381    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:54.559376    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:54.559389    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:54.559396    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:54.559399    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:54.561772    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:55.059628    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:55.059676    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:55.059690    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:55.059700    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:55.063078    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:55.559418    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:55.559433    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:55.559439    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:55.559442    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:55.561338    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:55.561664    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:31:56.059758    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:56.059770    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:56.059776    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:56.059780    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:56.061794    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:56.560083    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:56.560095    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:56.560101    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:56.560105    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:56.562114    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.058995    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:57.059011    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:57.059017    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:57.059021    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:57.060963    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.560137    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:57.560149    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:57.560155    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:57.560159    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:57.561978    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.562328    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:31:58.059061    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:58.059074    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:58.059080    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:58.059086    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:58.061472    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:58.559244    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:58.559270    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:58.559282    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:58.559289    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:58.562722    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:59.060308    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:59.060330    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:59.060342    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:59.060359    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:59.063517    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:59.560099    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:59.560116    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:59.560125    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:59.560129    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:59.562184    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:59.562628    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:00.059591    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:00.059615    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:00.059662    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:00.059677    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:00.063389    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:00.560430    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:00.560444    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:00.560451    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:00.560455    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:00.562483    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:01.059473    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:01.059498    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:01.059509    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:01.059514    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:01.062773    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:01.559271    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:01.559298    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:01.559310    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:01.559317    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:01.562641    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:01.563242    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:02.060140    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:02.060168    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:02.060211    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:02.060244    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:02.063601    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:02.559282    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:02.559308    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:02.559320    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:02.559329    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:02.562623    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:03.059890    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:03.059911    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:03.059923    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:03.059930    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:03.063409    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:03.559394    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:03.559453    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:03.559465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:03.559470    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:03.562567    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:04.060698    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:04.060714    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:04.060719    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:04.060727    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:04.062955    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:04.063278    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:04.560096    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:04.560118    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:04.560165    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:04.560173    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:04.562791    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:05.060622    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:05.060648    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:05.060659    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:05.060665    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:05.064011    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:05.559954    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:05.559976    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:05.559988    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:05.559994    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:05.563422    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:06.059812    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:06.059870    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:06.059880    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:06.059886    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:06.062529    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:06.560071    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:06.560096    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:06.560107    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:06.560113    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:06.563538    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:06.564037    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:07.059298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:07.059324    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:07.059335    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:07.059342    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:07.063048    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:07.559252    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:07.559277    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:07.559291    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:07.559297    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:07.562373    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:08.061149    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:08.061210    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:08.061223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:08.061234    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:08.064402    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:08.559428    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:08.559452    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:08.559463    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:08.559468    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:08.562526    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:09.060827    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:09.060878    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:09.060891    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:09.060900    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:09.063954    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:09.064537    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:09.561212    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:09.561237    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:09.561283    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:09.561292    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:09.564677    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:10.060675    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:10.060694    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:10.060714    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:10.060718    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:10.062779    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:10.560397    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:10.560424    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:10.560435    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:10.560441    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:10.564079    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.060679    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:11.060705    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:11.060716    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:11.060722    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:11.064114    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.559466    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:11.559492    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:11.559503    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:11.559567    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:11.562752    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.563402    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:12.059348    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:12.059373    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:12.059384    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:12.059389    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:12.062810    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:12.561048    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:12.561106    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:12.561120    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:12.561141    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:12.564459    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.059831    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.059855    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.059867    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.059873    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.063079    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.063582    2876 node_ready.go:49] node "ha-949000-m03" has status "Ready":"True"
	I0831 15:32:13.063594    2876 node_ready.go:38] duration metric: took 19.504599366s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:32:13.063602    2876 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:32:13.063657    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:13.063665    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.063674    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.063682    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.067458    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.072324    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.072373    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:32:13.072379    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.072385    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.072389    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.074327    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.074802    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.074810    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.074815    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.074820    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.076654    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.076987    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.076996    2876 pod_ready.go:82] duration metric: took 4.661444ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.077003    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.077041    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:32:13.077046    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.077052    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.077056    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.078862    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.079264    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.079271    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.079277    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.079280    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.081027    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.081326    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.081335    2876 pod_ready.go:82] duration metric: took 4.326858ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.081342    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.081372    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:32:13.081379    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.081385    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.081388    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.083263    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.083632    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.083639    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.083645    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.083649    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.085181    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.085480    2876 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.085490    2876 pod_ready.go:82] duration metric: took 4.142531ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.085497    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.085526    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:32:13.085531    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.085537    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.085541    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.087128    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.087501    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:13.087508    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.087513    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.087518    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.088959    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.089244    2876 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.089252    2876 pod_ready.go:82] duration metric: took 3.751049ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.089258    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.261887    2876 request.go:632] Waited for 172.592535ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:32:13.261972    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:32:13.261978    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.262019    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.262028    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.264296    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:13.460589    2876 request.go:632] Waited for 195.842812ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.460724    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.460735    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.460745    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.460759    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.463962    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.464378    2876 pod_ready.go:93] pod "etcd-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.464391    2876 pod_ready.go:82] duration metric: took 375.12348ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.464404    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.661862    2876 request.go:632] Waited for 197.406518ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:32:13.661977    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:32:13.661988    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.661999    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.662005    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.665393    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.861181    2876 request.go:632] Waited for 195.385788ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.861214    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.861218    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.861225    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.861260    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.863261    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.863567    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.863577    2876 pod_ready.go:82] duration metric: took 399.161484ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.863584    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.061861    2876 request.go:632] Waited for 198.232413ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:32:14.061952    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:32:14.061961    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.061972    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.061979    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.064530    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:14.260004    2876 request.go:632] Waited for 194.98208ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:14.260143    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:14.260166    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.260182    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.260227    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.266580    2876 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:32:14.266908    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:14.266927    2876 pod_ready.go:82] duration metric: took 403.325368ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.266937    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.460025    2876 request.go:632] Waited for 193.045445ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:32:14.460093    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:32:14.460101    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.460110    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.460117    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.462588    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:14.660940    2876 request.go:632] Waited for 197.721547ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:14.661070    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:14.661080    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.661096    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.661109    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.664541    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:14.664954    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:14.664967    2876 pod_ready.go:82] duration metric: took 398.020825ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.664979    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.861147    2876 request.go:632] Waited for 196.115866ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:32:14.861203    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:32:14.861211    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.861223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.861231    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.864847    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.060912    2876 request.go:632] Waited for 195.310518ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:15.060968    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:15.060983    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.061000    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.061011    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.064271    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.064583    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.064594    2876 pod_ready.go:82] duration metric: took 399.604845ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.064603    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.260515    2876 request.go:632] Waited for 195.841074ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:32:15.260662    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:32:15.260676    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.260688    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.260702    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.264411    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.461372    2876 request.go:632] Waited for 196.432681ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:15.461470    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:15.461484    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.461502    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.461513    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.464382    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:15.464683    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.464691    2876 pod_ready.go:82] duration metric: took 400.078711ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.464700    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.660288    2876 request.go:632] Waited for 195.551444ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:32:15.660318    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:32:15.660323    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.660357    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.660363    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.663247    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:15.860473    2876 request.go:632] Waited for 196.823661ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:15.860532    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:15.860542    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.860556    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.860563    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.863954    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.864333    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.864346    2876 pod_ready.go:82] duration metric: took 399.636293ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.864355    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.060306    2876 request.go:632] Waited for 195.900703ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:32:16.060410    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:32:16.060437    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.060449    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.060455    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.063745    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.260402    2876 request.go:632] Waited for 195.997957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:16.260523    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:16.260539    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.260551    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.260563    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.264052    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.264373    2876 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:16.264385    2876 pod_ready.go:82] duration metric: took 400.01997ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.264394    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.461128    2876 request.go:632] Waited for 196.682855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:32:16.461251    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:32:16.461264    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.461275    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.461282    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.464602    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.660248    2876 request.go:632] Waited for 195.08291ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:16.660298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:16.660310    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.660327    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.660340    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.663471    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.664017    2876 pod_ready.go:93] pod "kube-proxy-d45q5" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:16.664029    2876 pod_ready.go:82] duration metric: took 399.623986ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.664038    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.859948    2876 request.go:632] Waited for 195.845325ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:32:16.860034    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:32:16.860060    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.860083    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.860094    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.863263    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.060250    2876 request.go:632] Waited for 196.410574ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.060307    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.060319    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.060334    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.060345    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.063664    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.064113    2876 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.064125    2876 pod_ready.go:82] duration metric: took 400.076522ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.064134    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.260150    2876 request.go:632] Waited for 195.935266ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:32:17.260232    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:32:17.260246    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.260305    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.260324    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.263756    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.460703    2876 request.go:632] Waited for 196.426241ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.460753    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.460765    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.460776    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.460799    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.463925    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.464439    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.464449    2876 pod_ready.go:82] duration metric: took 400.306164ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.464463    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.660506    2876 request.go:632] Waited for 196.00354ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:32:17.660541    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:32:17.660547    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.660553    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.660568    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.662504    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:17.859973    2876 request.go:632] Waited for 197.106962ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:17.860023    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:17.860031    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.860084    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.860092    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.869330    2876 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0831 15:32:17.869629    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.869638    2876 pod_ready.go:82] duration metric: took 405.16449ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.869646    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:18.060370    2876 request.go:632] Waited for 190.671952ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:32:18.060479    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:32:18.060492    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.060504    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.060511    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.063196    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.260902    2876 request.go:632] Waited for 197.387182ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:18.260947    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:18.260955    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.260976    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.261000    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.263780    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.264154    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:18.264163    2876 pod_ready.go:82] duration metric: took 394.508983ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:18.264171    2876 pod_ready.go:39] duration metric: took 5.200505122s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:32:18.264182    2876 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:32:18.264235    2876 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:32:18.276016    2876 api_server.go:72] duration metric: took 25.102905505s to wait for apiserver process to appear ...
	I0831 15:32:18.276029    2876 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:32:18.276040    2876 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:32:18.280474    2876 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:32:18.280519    2876 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:32:18.280525    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.280531    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.280535    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.281148    2876 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:32:18.281176    2876 api_server.go:141] control plane version: v1.31.0
	I0831 15:32:18.281184    2876 api_server.go:131] duration metric: took 5.150155ms to wait for apiserver health ...
	I0831 15:32:18.281189    2876 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:32:18.460471    2876 request.go:632] Waited for 179.236076ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.460573    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.460585    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.460596    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.460604    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.465317    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:32:18.469906    2876 system_pods.go:59] 24 kube-system pods found
	I0831 15:32:18.469918    2876 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:32:18.469921    2876 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:32:18.469925    2876 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:32:18.469928    2876 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:32:18.469933    2876 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:32:18.469937    2876 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:32:18.469939    2876 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:32:18.469943    2876 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:32:18.469946    2876 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:32:18.469949    2876 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:32:18.469954    2876 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:32:18.469958    2876 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:32:18.469961    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:32:18.469963    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:32:18.469966    2876 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:32:18.469969    2876 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:32:18.469972    2876 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:32:18.469975    2876 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:32:18.469978    2876 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:32:18.469980    2876 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:32:18.469983    2876 system_pods.go:61] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:32:18.469985    2876 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:32:18.469988    2876 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:32:18.469990    2876 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:32:18.469994    2876 system_pods.go:74] duration metric: took 188.799972ms to wait for pod list to return data ...
	I0831 15:32:18.470000    2876 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:32:18.659945    2876 request.go:632] Waited for 189.894855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:32:18.659986    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:32:18.660002    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.660011    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.660017    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.662843    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.662901    2876 default_sa.go:45] found service account: "default"
	I0831 15:32:18.662910    2876 default_sa.go:55] duration metric: took 192.903479ms for default service account to be created ...
	I0831 15:32:18.662915    2876 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:32:18.860267    2876 request.go:632] Waited for 197.296928ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.860299    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.860304    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.860310    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.860316    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.864052    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:18.868873    2876 system_pods.go:86] 24 kube-system pods found
	I0831 15:32:18.868886    2876 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:32:18.868891    2876 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:32:18.868894    2876 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:32:18.868897    2876 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:32:18.868901    2876 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:32:18.868904    2876 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:32:18.868907    2876 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:32:18.868912    2876 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:32:18.868916    2876 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:32:18.868918    2876 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:32:18.868922    2876 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:32:18.868927    2876 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:32:18.868931    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:32:18.868934    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:32:18.868938    2876 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:32:18.868941    2876 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:32:18.868944    2876 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:32:18.868947    2876 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:32:18.868950    2876 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:32:18.868953    2876 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:32:18.868957    2876 system_pods.go:89] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:32:18.868959    2876 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:32:18.868963    2876 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:32:18.868966    2876 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:32:18.868971    2876 system_pods.go:126] duration metric: took 206.049826ms to wait for k8s-apps to be running ...
	I0831 15:32:18.868980    2876 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:32:18.869030    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:32:18.880958    2876 system_svc.go:56] duration metric: took 11.976044ms WaitForService to wait for kubelet
	I0831 15:32:18.880978    2876 kubeadm.go:582] duration metric: took 25.707859659s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:32:18.880990    2876 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:32:19.060320    2876 request.go:632] Waited for 179.26426ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:32:19.060365    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:32:19.060371    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:19.060379    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:19.060385    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:19.063168    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:19.063767    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063776    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063782    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063785    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063789    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063791    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063794    2876 node_conditions.go:105] duration metric: took 182.798166ms to run NodePressure ...
	I0831 15:32:19.063802    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:32:19.063817    2876 start.go:255] writing updated cluster config ...
	I0831 15:32:19.064186    2876 ssh_runner.go:195] Run: rm -f paused
	I0831 15:32:19.107477    2876 start.go:600] kubectl: 1.29.2, cluster: 1.31.0 (minor skew: 2)
	I0831 15:32:19.128559    2876 out.go:201] 
	W0831 15:32:19.149451    2876 out.go:270] ! /usr/local/bin/kubectl is version 1.29.2, which may have incompatibilities with Kubernetes 1.31.0.
	I0831 15:32:19.170407    2876 out.go:177]   - Want kubectl v1.31.0? Try 'minikube kubectl -- get pods -A'
	I0831 15:32:19.212551    2876 out.go:177] * Done! kubectl is now configured to use "ha-949000" cluster and "default" namespace by default
	
	
	==> Docker <==
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/7da75377db13c80b27b99ccc9f52561a4408675361947cf393e0c38286a71997/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.201910840Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202112013Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202132705Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202328611Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/1017bd5eac1d26de2df318c0dc0ac8d5db92d72e8c268401502a145b3ad0d9d8/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/271da20951c9ab4102e979dc2b97b3a9c8d992db5fc7ebac3f954ea9edee9d48/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.346950244Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347136993Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347223771Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347348772Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379063396Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379210402Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379226413Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379336044Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.320619490Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.320945499Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.321018153Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.321131565Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:32:21Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/f68483c946835415bfdf0531bfc6be41dd321162f4c19af555ece0f66ee7cabe/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Aug 31 22:32:22 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:32:22Z" level=info msg="Stop pulling image gcr.io/k8s-minikube/busybox:1.28: Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:1.28"
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716842379Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716906766Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716920530Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.721236974Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	2f925f16b74b0       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   About a minute ago   Running             busybox                   0                   f68483c946835       busybox-7dff88458-5kkbw
	b1db836cd7a3d       cbb01a7bd410d                                                                                         3 minutes ago        Running             coredns                   0                   271da20951c9a       coredns-6f6b679f8f-kjszm
	def4d6bd20bc5       cbb01a7bd410d                                                                                         3 minutes ago        Running             coredns                   0                   1017bd5eac1d2       coredns-6f6b679f8f-snq8s
	22fbb8a8e01ad       6e38f40d628db                                                                                         3 minutes ago        Running             storage-provisioner       0                   7da75377db13c       storage-provisioner
	6d156ce626115       kindest/kindnetd@sha256:e59a687ca28ae274a2fc92f1e2f5f1c739f353178a43a23aafc71adb802ed166              4 minutes ago        Running             kindnet-cni               0                   7d1851c17485c       kindnet-jzj42
	54d5f8041c89d       ad83b2ca7b09e                                                                                         4 minutes ago        Running             kube-proxy                0                   4b0198ac7dc52       kube-proxy-q7ndn
	c99fe831b20c1       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     4 minutes ago        Running             kube-vip                  0                   9ef7e0fa361d5       kube-vip-ha-949000
	c734c23a53082       2e96e5913fc06                                                                                         4 minutes ago        Running             etcd                      0                   7cfaf9f5d4dd4       etcd-ha-949000
	02c10e4f765d1       1766f54c897f0                                                                                         4 minutes ago        Running             kube-scheduler            0                   c084f2a259f6c       kube-scheduler-ha-949000
	6670fd34164cb       045733566833c                                                                                         4 minutes ago        Running             kube-controller-manager   0                   f9573e28f9d4d       kube-controller-manager-ha-949000
	ffec6106be6c8       604f5db92eaa8                                                                                         4 minutes ago        Running             kube-apiserver            0                   25c49852f78dc       kube-apiserver-ha-949000
	
	
	==> coredns [b1db836cd7a3] <==
	[INFO] 10.244.1.2:56414 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000107837s
	[INFO] 10.244.1.2:53184 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000079726s
	[INFO] 10.244.1.2:58757 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000418868s
	[INFO] 10.244.1.2:39299 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000067106s
	[INFO] 10.244.2.2:56948 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000080585s
	[INFO] 10.244.2.2:56973 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000078985s
	[INFO] 10.244.2.2:43081 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000100123s
	[INFO] 10.244.2.2:56390 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000040214s
	[INFO] 10.244.2.2:52519 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000061255s
	[INFO] 10.244.0.4:36226 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000151133s
	[INFO] 10.244.1.2:44017 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089111s
	[INFO] 10.244.1.2:37224 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000069144s
	[INFO] 10.244.1.2:51282 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000118723s
	[INFO] 10.244.2.2:35009 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089507s
	[INFO] 10.244.2.2:60607 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000049176s
	[INFO] 10.244.2.2:36851 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000097758s
	[INFO] 10.244.0.4:59717 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000053986s
	[INFO] 10.244.0.4:58447 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000060419s
	[INFO] 10.244.1.2:60381 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000136898s
	[INFO] 10.244.1.2:32783 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.00010303s
	[INFO] 10.244.1.2:44904 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000042493s
	[INFO] 10.244.1.2:44085 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000132084s
	[INFO] 10.244.2.2:43635 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000080947s
	[INFO] 10.244.2.2:40020 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000081919s
	[INFO] 10.244.2.2:53730 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000058015s
	
	
	==> coredns [def4d6bd20bc] <==
	[INFO] 10.244.0.4:41865 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.008744161s
	[INFO] 10.244.1.2:50080 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000093199s
	[INFO] 10.244.1.2:55576 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.000574417s
	[INFO] 10.244.1.2:36293 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000065455s
	[INFO] 10.244.2.2:41223 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000063892s
	[INFO] 10.244.0.4:54135 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000096141s
	[INFO] 10.244.0.4:39176 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000742646s
	[INFO] 10.244.0.4:58445 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000080113s
	[INFO] 10.244.0.4:56242 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000066269s
	[INFO] 10.244.0.4:60657 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049645s
	[INFO] 10.244.1.2:48306 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000561931s
	[INFO] 10.244.1.2:40767 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000077826s
	[INFO] 10.244.1.2:35669 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000056994s
	[INFO] 10.244.1.2:57720 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000040565s
	[INFO] 10.244.2.2:38794 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000136901s
	[INFO] 10.244.2.2:33576 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000052374s
	[INFO] 10.244.2.2:57053 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000051289s
	[INFO] 10.244.0.4:47623 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000056903s
	[INFO] 10.244.0.4:59818 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00003011s
	[INFO] 10.244.0.4:53586 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000029565s
	[INFO] 10.244.1.2:60045 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000060878s
	[INFO] 10.244.2.2:38400 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000078624s
	[INFO] 10.244.0.4:58765 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000075707s
	[INFO] 10.244.0.4:32804 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000050785s
	[INFO] 10.244.2.2:48459 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.00007773s
	
	
	==> describe nodes <==
	Name:               ha-949000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:29:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:33:58 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:30:07 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-949000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 e8535f0b09e14aea8b2456a9d977fc80
	  System UUID:                98ca49d1-0000-0000-9e6c-321a4533d56e
	  Boot ID:                    4896b77b-e0f4-43c0-af0e-3998b4352bec
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-5kkbw              0 (0%)        0 (0%)      0 (0%)           0 (0%)         100s
	  kube-system                 coredns-6f6b679f8f-kjszm             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     4m11s
	  kube-system                 coredns-6f6b679f8f-snq8s             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     4m11s
	  kube-system                 etcd-ha-949000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         4m16s
	  kube-system                 kindnet-jzj42                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      4m12s
	  kube-system                 kube-apiserver-ha-949000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         4m17s
	  kube-system                 kube-controller-manager-ha-949000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         4m16s
	  kube-system                 kube-proxy-q7ndn                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m12s
	  kube-system                 kube-scheduler-ha-949000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         4m18s
	  kube-system                 kube-vip-ha-949000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m18s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m12s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age    From             Message
	  ----    ------                   ----   ----             -------
	  Normal  Starting                 4m10s  kube-proxy       
	  Normal  Starting                 4m16s  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  4m16s  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  4m16s  kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    4m16s  kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     4m16s  kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           4m12s  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeReady                3m53s  kubelet          Node ha-949000 status is now: NodeReady
	  Normal  RegisteredNode           3m12s  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           2m2s   node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	
	
	Name:               ha-949000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:30:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:33:44 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:43 +0000   Sat, 31 Aug 2024 22:31:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-949000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 31d5d81c627e4d65bfa15e4c54f7f7c1
	  System UUID:                23e54f3d-0000-0000-86b7-b25c818528d1
	  Boot ID:                    021c5fd3-b441-490e-ac27-d927c00459f2
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-6r9s5                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         100s
	  kube-system                 etcd-ha-949000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         3m17s
	  kube-system                 kindnet-brtj6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      3m19s
	  kube-system                 kube-apiserver-ha-949000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         3m17s
	  kube-system                 kube-controller-manager-ha-949000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         3m14s
	  kube-system                 kube-proxy-4r2bt                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m19s
	  kube-system                 kube-scheduler-ha-949000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         3m13s
	  kube-system                 kube-vip-ha-949000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m15s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 3m15s                  kube-proxy       
	  Normal  NodeHasSufficientMemory  3m19s (x8 over 3m19s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    3m19s (x8 over 3m19s)  kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     3m19s (x7 over 3m19s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  3m19s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           3m17s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal  RegisteredNode           3m12s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal  RegisteredNode           2m2s                   node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	
	
	Name:               ha-949000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_31_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:31:50 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:33:52 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:32:13 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-949000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 0aea5b50957a40edad0152e71b7f3a2a
	  System UUID:                3fde4d5b-0000-0000-8412-6ae6e5c787bb
	  Boot ID:                    2d4c31ca-c268-4eb4-ad45-716d78aaaa5c
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-vjf9x                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         100s
	  kube-system                 etcd-ha-949000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         2m7s
	  kube-system                 kindnet-9j85v                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      2m10s
	  kube-system                 kube-apiserver-ha-949000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         2m7s
	  kube-system                 kube-controller-manager-ha-949000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         2m9s
	  kube-system                 kube-proxy-d45q5                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         2m10s
	  kube-system                 kube-scheduler-ha-949000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         2m9s
	  kube-system                 kube-vip-ha-949000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         2m6s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 2m6s                   kube-proxy       
	  Normal  NodeHasSufficientMemory  2m10s (x8 over 2m10s)  kubelet          Node ha-949000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    2m10s (x8 over 2m10s)  kubelet          Node ha-949000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     2m10s (x7 over 2m10s)  kubelet          Node ha-949000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  2m10s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           2m7s                   node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal  RegisteredNode           2m7s                   node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal  RegisteredNode           2m2s                   node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	
	
	==> dmesg <==
	[  +2.774485] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.237441] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000003] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.596627] systemd-fstab-generator[494]: Ignoring "noauto" option for root device
	[  +0.090743] systemd-fstab-generator[506]: Ignoring "noauto" option for root device
	[  +1.756564] systemd-fstab-generator[845]: Ignoring "noauto" option for root device
	[  +0.273405] systemd-fstab-generator[883]: Ignoring "noauto" option for root device
	[  +0.102089] systemd-fstab-generator[895]: Ignoring "noauto" option for root device
	[  +0.058959] kauditd_printk_skb: 115 callbacks suppressed
	[  +0.059797] systemd-fstab-generator[909]: Ignoring "noauto" option for root device
	[  +2.526421] systemd-fstab-generator[1125]: Ignoring "noauto" option for root device
	[  +0.100331] systemd-fstab-generator[1137]: Ignoring "noauto" option for root device
	[  +0.099114] systemd-fstab-generator[1149]: Ignoring "noauto" option for root device
	[  +0.141519] systemd-fstab-generator[1164]: Ignoring "noauto" option for root device
	[  +3.497423] systemd-fstab-generator[1265]: Ignoring "noauto" option for root device
	[  +0.066902] kauditd_printk_skb: 158 callbacks suppressed
	[  +2.572406] systemd-fstab-generator[1521]: Ignoring "noauto" option for root device
	[  +3.569896] systemd-fstab-generator[1651]: Ignoring "noauto" option for root device
	[  +0.054418] kauditd_printk_skb: 70 callbacks suppressed
	[  +7.004094] systemd-fstab-generator[2150]: Ignoring "noauto" option for root device
	[  +0.086539] kauditd_printk_skb: 72 callbacks suppressed
	[  +5.400345] kauditd_printk_skb: 12 callbacks suppressed
	[  +5.311598] kauditd_printk_skb: 29 callbacks suppressed
	[Aug31 22:30] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [c734c23a5308] <==
	{"level":"warn","ts":"2024-08-31T22:34:00.596231Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.617951Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.649594Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.718367Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.745521Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.818734Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.918605Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.927057Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.932880Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.935227Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.943254Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.947106Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.950470Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.953728Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.955927Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.960055Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.963093Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.966127Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.970328Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.972527Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.978084Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.981071Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:00.984477Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:01.018615Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:01.118803Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	
	
	==> kernel <==
	 22:34:01 up 4 min,  0 users,  load average: 0.36, 0.22, 0.10
	Linux ha-949000 5.10.207 #1 SMP Wed Aug 28 20:54:17 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [6d156ce62611] <==
	I0831 22:33:15.620783       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:25.614304       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:25.614589       1 main.go:299] handling current node
	I0831 22:33:25.614804       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:25.615060       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:25.615515       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:25.615641       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:35.620070       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:35.620108       1 main.go:299] handling current node
	I0831 22:33:35.620119       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:35.620124       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:35.620269       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:35.620297       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:45.620982       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:45.621246       1 main.go:299] handling current node
	I0831 22:33:45.621372       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:45.621475       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:45.621703       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:45.621934       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:33:55.614391       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:33:55.614479       1 main.go:299] handling current node
	I0831 22:33:55.614598       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:33:55.614749       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:33:55.615255       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:33:55.615757       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [ffec6106be6c] <==
	I0831 22:29:42.351464       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0831 22:29:42.447047       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0831 22:29:42.450860       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	I0831 22:29:42.451599       1 controller.go:615] quota admission added evaluator for: endpoints
	I0831 22:29:42.454145       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0831 22:29:43.117776       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0831 22:29:44.628868       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0831 22:29:44.643482       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0831 22:29:44.649286       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0831 22:29:48.568363       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	I0831 22:29:48.768446       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	E0831 22:32:24.583976       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51190: use of closed network connection
	E0831 22:32:24.787019       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51192: use of closed network connection
	E0831 22:32:24.994355       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51194: use of closed network connection
	E0831 22:32:25.183977       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51196: use of closed network connection
	E0831 22:32:25.381277       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51198: use of closed network connection
	E0831 22:32:25.569952       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51200: use of closed network connection
	E0831 22:32:25.763008       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51202: use of closed network connection
	E0831 22:32:25.965367       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51204: use of closed network connection
	E0831 22:32:26.154701       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51206: use of closed network connection
	E0831 22:32:26.694309       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51211: use of closed network connection
	E0831 22:32:26.880399       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51213: use of closed network connection
	E0831 22:32:27.077320       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51215: use of closed network connection
	E0831 22:32:27.267610       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51217: use of closed network connection
	E0831 22:32:27.476005       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51219: use of closed network connection
	
	
	==> kube-controller-manager [6670fd34164c] <==
	I0831 22:31:58.309145       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:31:58.363553       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:00.655864       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:13.090917       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:13.100697       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:13.164123       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:20.074086       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="91.437594ms"
	I0831 22:32:20.089117       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="14.696904ms"
	I0831 22:32:20.155832       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="66.417676ms"
	I0831 22:32:20.247938       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="91.617712ms"
	E0831 22:32:20.248480       1 replica_set.go:560] "Unhandled Error" err="sync \"default/busybox-7dff88458\" failed with Operation cannot be fulfilled on replicasets.apps \"busybox-7dff88458\": the object has been modified; please apply your changes to the latest version and try again" logger="UnhandledError"
	I0831 22:32:20.257744       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="7.890782ms"
	I0831 22:32:20.258053       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="29.491µs"
	I0831 22:32:20.352807       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="29.639µs"
	I0831 22:32:21.164054       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:21.310383       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="34.795µs"
	I0831 22:32:22.115926       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="5.066721ms"
	I0831 22:32:22.116004       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="26.449µs"
	I0831 22:32:23.502335       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="6.289855ms"
	I0831 22:32:23.502432       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="58.061µs"
	I0831 22:32:24.043757       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="4.626106ms"
	I0831 22:32:24.044703       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="46.785µs"
	I0831 22:32:44.005602       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m02"
	I0831 22:32:48.178405       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000"
	I0831 22:32:52.115444       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	
	
	==> kube-proxy [54d5f8041c89] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:29:49.977338       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:29:49.983071       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:29:49.983430       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:29:50.023032       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:29:50.023054       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:29:50.023070       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:29:50.025790       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:29:50.026014       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:29:50.026061       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:29:50.026844       1 config.go:197] "Starting service config controller"
	I0831 22:29:50.027602       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:29:50.027141       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:29:50.027698       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:29:50.027260       1 config.go:326] "Starting node config controller"
	I0831 22:29:50.027720       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:29:50.128122       1 shared_informer.go:320] Caches are synced for node config
	I0831 22:29:50.128144       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:29:50.128162       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-scheduler [02c10e4f765d] <==
	W0831 22:29:42.107023       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0831 22:29:42.107231       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.111966       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0831 22:29:42.112045       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.116498       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0831 22:29:42.116539       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.129701       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0831 22:29:42.129741       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError"
	I0831 22:29:45.342252       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0831 22:31:50.464567       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.464652       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod 9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2(kube-system/kube-proxy-d45q5) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-d45q5"
	E0831 22:31:50.464667       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" pod="kube-system/kube-proxy-d45q5"
	I0831 22:31:50.464683       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.476710       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:31:50.476756       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod c551bb18-9a7d-4fca-9724-be7900980a40(kube-system/kindnet-l4zbh) wasn't assumed so cannot be forgotten" pod="kube-system/kindnet-l4zbh"
	E0831 22:31:50.476767       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" pod="kube-system/kindnet-l4zbh"
	I0831 22:31:50.476781       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:32:20.049491       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-6r9s5" node="ha-949000-m02"
	E0831 22:32:20.049618       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" pod="default/busybox-7dff88458-6r9s5"
	E0831 22:32:20.071235       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-vjf9x" node="ha-949000-m03"
	E0831 22:32:20.071466       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" pod="default/busybox-7dff88458-vjf9x"
	E0831 22:32:20.073498       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	E0831 22:32:20.073571       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod e97e21d8-a69e-451c-babd-6232e12aafe0(default/busybox-7dff88458-5kkbw) wasn't assumed so cannot be forgotten" pod="default/busybox-7dff88458-5kkbw"
	E0831 22:32:20.077323       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" pod="default/busybox-7dff88458-5kkbw"
	I0831 22:32:20.077394       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	
	
	==> kubelet <==
	Aug 31 22:30:08 ha-949000 kubelet[2157]: I0831 22:30:08.742452    2157 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-snq8s" podStartSLOduration=19.742440453 podStartE2EDuration="19.742440453s" podCreationTimestamp="2024-08-31 22:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-31 22:30:08.742201936 +0000 UTC m=+24.362226027" watchObservedRunningTime="2024-08-31 22:30:08.742440453 +0000 UTC m=+24.362464538"
	Aug 31 22:30:08 ha-949000 kubelet[2157]: I0831 22:30:08.742651    2157 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/storage-provisioner" podStartSLOduration=20.742642621999998 podStartE2EDuration="20.742642622s" podCreationTimestamp="2024-08-31 22:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-31 22:30:08.732189424 +0000 UTC m=+24.352213514" watchObservedRunningTime="2024-08-31 22:30:08.742642622 +0000 UTC m=+24.362666707"
	Aug 31 22:30:44 ha-949000 kubelet[2157]: E0831 22:30:44.495173    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:30:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:30:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:30:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:30:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:31:44 ha-949000 kubelet[2157]: E0831 22:31:44.490275    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:31:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:32:20 ha-949000 kubelet[2157]: W0831 22:32:20.081132    2157 reflector.go:561] object-"default"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ha-949000" cannot list resource "configmaps" in API group "" in the namespace "default": no relationship found between node 'ha-949000' and this object
	Aug 31 22:32:20 ha-949000 kubelet[2157]: E0831 22:32:20.081252    2157 reflector.go:158] "Unhandled Error" err="object-\"default\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ha-949000\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"default\": no relationship found between node 'ha-949000' and this object" logger="UnhandledError"
	Aug 31 22:32:20 ha-949000 kubelet[2157]: I0831 22:32:20.223174    2157 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l95k\" (UniqueName: \"kubernetes.io/projected/e97e21d8-a69e-451c-babd-6232e12aafe0-kube-api-access-6l95k\") pod \"busybox-7dff88458-5kkbw\" (UID: \"e97e21d8-a69e-451c-babd-6232e12aafe0\") " pod="default/busybox-7dff88458-5kkbw"
	Aug 31 22:32:44 ha-949000 kubelet[2157]: E0831 22:32:44.489812    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:32:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:33:44 ha-949000 kubelet[2157]: E0831 22:33:44.492393    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:33:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-949000 -n ha-949000
helpers_test.go:262: (dbg) Run:  kubectl --context ha-949000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:286: <<< TestMultiControlPlane/serial/StopSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:287: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/StopSecondaryNode (11.66s)

                                                
                                    
TestMultiControlPlane/serial/RestartSecondaryNode (93.43s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/RestartSecondaryNode
ha_test.go:420: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 node start m02 -v=7 --alsologtostderr
E0831 15:34:14.663780    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:34:15.441139    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
ha_test.go:420: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 node start m02 -v=7 --alsologtostderr: (36.978751818s)
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (451.554263ms)

                                                
                                                
-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:34:39.772409    3547 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:34:39.772626    3547 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:39.772631    3547 out.go:358] Setting ErrFile to fd 2...
	I0831 15:34:39.772635    3547 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:39.772824    3547 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:34:39.773001    3547 out.go:352] Setting JSON to false
	I0831 15:34:39.773023    3547 mustload.go:65] Loading cluster: ha-949000
	I0831 15:34:39.773063    3547 notify.go:220] Checking for updates...
	I0831 15:34:39.773353    3547 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:34:39.773370    3547 status.go:255] checking status of ha-949000 ...
	I0831 15:34:39.773775    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:39.773841    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:39.782730    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51457
	I0831 15:34:39.783103    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:39.783523    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:39.783535    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:39.783742    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:39.783847    3547 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:34:39.783936    3547 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:39.784006    3547 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:34:39.784974    3547 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:34:39.784993    3547 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:39.785241    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:39.785262    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:39.793796    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51459
	I0831 15:34:39.794126    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:39.794481    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:39.794513    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:39.794747    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:39.794865    3547 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:34:39.794955    3547 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:39.795207    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:39.795231    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:39.804256    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51461
	I0831 15:34:39.804584    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:39.804907    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:39.804919    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:39.805123    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:39.805220    3547 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:34:39.805358    3547 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:39.805377    3547 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:34:39.805453    3547 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:34:39.805534    3547 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:34:39.805617    3547 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:34:39.805699    3547 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:34:39.844710    3547 ssh_runner.go:195] Run: systemctl --version
	I0831 15:34:39.849513    3547 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:39.861032    3547 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:39.861056    3547 api_server.go:166] Checking apiserver status ...
	I0831 15:34:39.861097    3547 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:39.874085    3547 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:34:39.881603    3547 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:39.881653    3547 ssh_runner.go:195] Run: ls
	I0831 15:34:39.884959    3547 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:39.889165    3547 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:39.889179    3547 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:34:39.889188    3547 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:39.889199    3547 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:34:39.889471    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:39.889497    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:39.898057    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51465
	I0831 15:34:39.898414    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:39.898734    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:39.898744    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:39.898943    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:39.899076    3547 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:34:39.899164    3547 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:39.899278    3547 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:34:39.900201    3547 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:34:39.900210    3547 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:39.900445    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:39.900466    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:39.909065    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51467
	I0831 15:34:39.909392    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:39.909737    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:39.909753    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:39.909975    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:39.910080    3547 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:34:39.910174    3547 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:39.910461    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:39.910483    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:39.919076    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51469
	I0831 15:34:39.919422    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:39.919752    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:39.919761    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:39.919977    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:39.920089    3547 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:34:39.920223    3547 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:39.920234    3547 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:34:39.920322    3547 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:34:39.920404    3547 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:34:39.920485    3547 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:34:39.920551    3547 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:34:39.950601    3547 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:39.962469    3547 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:39.962484    3547 api_server.go:166] Checking apiserver status ...
	I0831 15:34:39.962533    3547 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:39.974234    3547 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup
	W0831 15:34:39.982077    3547 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:39.982122    3547 ssh_runner.go:195] Run: ls
	I0831 15:34:39.985199    3547 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:39.988361    3547 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:39.988372    3547 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:34:39.988380    3547 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:39.988390    3547 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:34:39.988650    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:39.988675    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:39.997405    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51473
	I0831 15:34:39.997734    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:39.998061    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:39.998079    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:39.998284    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:39.998400    3547 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:34:39.998497    3547 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:39.998566    3547 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:34:39.999560    3547 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:34:39.999572    3547 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:39.999820    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:39.999844    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:40.008449    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51475
	I0831 15:34:40.008775    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:40.009137    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:40.009151    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:40.009375    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:40.009491    3547 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:34:40.009570    3547 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:40.009819    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:40.009842    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:40.018558    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51477
	I0831 15:34:40.018889    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:40.019221    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:40.019240    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:40.019442    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:40.019551    3547 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:34:40.019682    3547 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:40.019694    3547 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:34:40.019771    3547 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:34:40.019847    3547 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:34:40.019961    3547 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:34:40.020043    3547 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:34:40.047676    3547 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:40.059732    3547 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:40.059746    3547 api_server.go:166] Checking apiserver status ...
	I0831 15:34:40.059785    3547 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:40.071949    3547 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:34:40.080111    3547 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:40.080157    3547 ssh_runner.go:195] Run: ls
	I0831 15:34:40.083345    3547 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:40.086414    3547 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:40.086429    3547 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:34:40.086437    3547 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:40.086447    3547 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:34:40.086730    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:40.086755    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:40.095790    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51481
	I0831 15:34:40.096142    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:40.096489    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:40.096504    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:40.096731    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:40.096851    3547 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:34:40.096936    3547 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:40.097013    3547 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:34:40.097993    3547 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:34:40.098003    3547 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:40.098259    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:40.098284    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:40.106705    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51483
	I0831 15:34:40.107040    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:40.107371    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:40.107381    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:40.107632    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:40.107749    3547 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:34:40.107838    3547 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:40.108092    3547 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:40.108129    3547 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:40.117000    3547 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51485
	I0831 15:34:40.117341    3547 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:40.117698    3547 main.go:141] libmachine: Using API Version  1
	I0831 15:34:40.117714    3547 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:40.117920    3547 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:40.118032    3547 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:34:40.118160    3547 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:40.118171    3547 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:34:40.118255    3547 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:34:40.118328    3547 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:34:40.118406    3547 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:34:40.118487    3547 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:34:40.154631    3547 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:40.166026    3547 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (451.461625ms)

                                                
                                                
-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:34:41.407111    3561 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:34:41.407795    3561 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:41.407805    3561 out.go:358] Setting ErrFile to fd 2...
	I0831 15:34:41.407811    3561 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:41.408292    3561 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:34:41.408495    3561 out.go:352] Setting JSON to false
	I0831 15:34:41.408519    3561 mustload.go:65] Loading cluster: ha-949000
	I0831 15:34:41.408553    3561 notify.go:220] Checking for updates...
	I0831 15:34:41.408798    3561 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:34:41.408813    3561 status.go:255] checking status of ha-949000 ...
	I0831 15:34:41.409149    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.409192    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.417933    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51489
	I0831 15:34:41.418250    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.418654    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.418662    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.418859    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.418977    3561 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:34:41.419056    3561 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:41.419121    3561 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:34:41.420120    3561 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:34:41.420146    3561 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:41.420402    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.420428    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.428842    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51491
	I0831 15:34:41.429177    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.429541    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.429564    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.429828    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.429948    3561 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:34:41.430059    3561 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:41.430316    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.430353    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.439737    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51493
	I0831 15:34:41.440101    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.440455    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.440469    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.440668    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.440767    3561 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:34:41.440907    3561 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:41.440927    3561 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:34:41.441016    3561 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:34:41.441094    3561 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:34:41.441164    3561 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:34:41.441252    3561 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:34:41.477748    3561 ssh_runner.go:195] Run: systemctl --version
	I0831 15:34:41.482439    3561 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:41.495866    3561 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:41.495890    3561 api_server.go:166] Checking apiserver status ...
	I0831 15:34:41.495934    3561 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:41.507703    3561 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:34:41.515457    3561 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:41.515513    3561 ssh_runner.go:195] Run: ls
	I0831 15:34:41.518852    3561 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:41.521926    3561 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:41.521938    3561 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:34:41.521947    3561 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:41.521957    3561 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:34:41.522214    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.522235    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.531090    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51497
	I0831 15:34:41.531449    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.531802    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.531816    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.532012    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.532116    3561 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:34:41.532194    3561 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:41.532261    3561 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:34:41.533236    3561 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:34:41.533245    3561 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:41.533493    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.533515    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.542343    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51499
	I0831 15:34:41.542650    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.542997    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.543014    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.543222    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.543329    3561 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:34:41.543403    3561 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:41.543662    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.543692    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.552248    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51501
	I0831 15:34:41.552602    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.552962    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.552974    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.553198    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.553318    3561 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:34:41.553453    3561 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:41.553471    3561 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:34:41.553559    3561 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:34:41.553648    3561 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:34:41.553740    3561 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:34:41.553842    3561 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:34:41.582911    3561 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:41.594541    3561 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:41.594557    3561 api_server.go:166] Checking apiserver status ...
	I0831 15:34:41.594596    3561 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:41.606800    3561 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup
	W0831 15:34:41.614791    3561 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:41.614846    3561 ssh_runner.go:195] Run: ls
	I0831 15:34:41.618209    3561 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:41.621263    3561 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:41.621275    3561 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:34:41.621283    3561 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:41.621293    3561 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:34:41.621550    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.621572    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.630131    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51505
	I0831 15:34:41.630469    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.630812    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.630828    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.631049    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.631185    3561 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:34:41.631282    3561 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:41.631358    3561 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:34:41.632354    3561 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:34:41.632363    3561 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:41.632632    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.632671    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.641156    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51507
	I0831 15:34:41.641494    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.641829    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.641845    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.642074    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.642187    3561 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:34:41.642269    3561 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:41.642534    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.642559    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.650908    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51509
	I0831 15:34:41.651236    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.651553    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.651563    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.651786    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.651905    3561 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:34:41.652034    3561 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:41.652045    3561 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:34:41.652121    3561 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:34:41.652195    3561 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:34:41.652276    3561 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:34:41.652357    3561 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:34:41.680741    3561 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:41.692597    3561 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:41.692612    3561 api_server.go:166] Checking apiserver status ...
	I0831 15:34:41.692650    3561 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:41.704188    3561 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:34:41.712904    3561 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:41.712964    3561 ssh_runner.go:195] Run: ls
	I0831 15:34:41.716315    3561 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:41.720214    3561 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:41.720231    3561 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:34:41.720241    3561 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:41.720253    3561 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:34:41.720564    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.720594    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.730154    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51513
	I0831 15:34:41.730525    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.730908    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.730923    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.731145    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.731259    3561 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:34:41.731350    3561 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:41.731447    3561 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:34:41.732453    3561 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:34:41.732463    3561 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:41.732751    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.732783    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.742386    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51515
	I0831 15:34:41.742732    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.743077    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.743091    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.743320    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.743427    3561 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:34:41.743514    3561 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:41.743776    3561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:41.743803    3561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:41.752599    3561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51517
	I0831 15:34:41.752932    3561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:41.753272    3561 main.go:141] libmachine: Using API Version  1
	I0831 15:34:41.753288    3561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:41.753488    3561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:41.753591    3561 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:34:41.753721    3561 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:41.753733    3561 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:34:41.753819    3561 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:34:41.753895    3561 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:34:41.753983    3561 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:34:41.754067    3561 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:34:41.789631    3561 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:41.801061    3561 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (458.517151ms)

                                                
                                                
-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:34:43.744860    3575 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:34:43.745047    3575 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:43.745053    3575 out.go:358] Setting ErrFile to fd 2...
	I0831 15:34:43.745056    3575 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:43.745239    3575 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:34:43.745414    3575 out.go:352] Setting JSON to false
	I0831 15:34:43.745437    3575 mustload.go:65] Loading cluster: ha-949000
	I0831 15:34:43.745476    3575 notify.go:220] Checking for updates...
	I0831 15:34:43.745740    3575 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:34:43.745755    3575 status.go:255] checking status of ha-949000 ...
	I0831 15:34:43.746122    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:43.746175    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:43.755172    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51521
	I0831 15:34:43.755486    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:43.755914    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:43.755925    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:43.756182    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:43.756301    3575 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:34:43.756387    3575 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:43.756454    3575 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:34:43.757446    3575 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:34:43.757467    3575 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:43.757709    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:43.757732    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:43.766298    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51523
	I0831 15:34:43.772670    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:43.773062    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:43.773078    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:43.773284    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:43.773383    3575 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:34:43.773461    3575 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:43.773697    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:43.773721    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:43.782307    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51525
	I0831 15:34:43.782643    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:43.782939    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:43.782953    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:43.783174    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:43.783281    3575 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:34:43.783416    3575 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:43.783433    3575 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:34:43.783510    3575 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:34:43.783588    3575 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:34:43.783669    3575 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:34:43.783748    3575 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:34:43.819706    3575 ssh_runner.go:195] Run: systemctl --version
	I0831 15:34:43.824079    3575 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:43.834931    3575 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:43.834953    3575 api_server.go:166] Checking apiserver status ...
	I0831 15:34:43.834989    3575 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:43.849313    3575 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:34:43.856772    3575 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:43.856829    3575 ssh_runner.go:195] Run: ls
	I0831 15:34:43.860408    3575 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:43.864002    3575 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:43.864016    3575 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:34:43.864025    3575 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:43.864041    3575 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:34:43.864359    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:43.864386    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:43.873462    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51529
	I0831 15:34:43.873794    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:43.874129    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:43.874154    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:43.874354    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:43.874458    3575 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:34:43.874545    3575 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:43.874615    3575 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:34:43.875608    3575 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:34:43.875619    3575 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:43.875865    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:43.875894    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:43.884584    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51531
	I0831 15:34:43.884960    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:43.885380    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:43.885409    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:43.885622    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:43.885741    3575 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:34:43.885830    3575 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:43.886101    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:43.886126    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:43.894827    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51533
	I0831 15:34:43.895159    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:43.895477    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:43.895486    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:43.895699    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:43.895799    3575 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:34:43.895922    3575 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:43.895934    3575 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:34:43.896015    3575 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:34:43.896100    3575 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:34:43.896195    3575 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:34:43.896276    3575 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:34:43.925817    3575 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:43.936170    3575 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:43.936187    3575 api_server.go:166] Checking apiserver status ...
	I0831 15:34:43.936231    3575 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:43.948262    3575 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup
	W0831 15:34:43.956633    3575 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:43.956692    3575 ssh_runner.go:195] Run: ls
	I0831 15:34:43.959841    3575 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:43.962921    3575 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:43.962933    3575 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:34:43.962943    3575 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:43.962954    3575 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:34:43.963210    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:43.963230    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:43.972011    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51537
	I0831 15:34:43.972373    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:43.972703    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:43.972715    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:43.972931    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:43.973027    3575 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:34:43.973102    3575 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:43.973181    3575 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:34:43.974155    3575 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:34:43.974163    3575 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:43.974403    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:43.974431    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:43.983040    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51539
	I0831 15:34:43.983365    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:43.983728    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:43.983752    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:43.983965    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:43.984061    3575 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:34:43.984142    3575 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:43.984415    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:43.984437    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:43.992856    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51541
	I0831 15:34:43.993185    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:43.993483    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:43.993494    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:43.993719    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:43.993829    3575 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:34:43.993946    3575 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:43.993957    3575 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:34:43.994033    3575 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:34:43.994107    3575 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:34:43.994179    3575 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:34:43.994280    3575 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:34:44.022396    3575 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:44.033826    3575 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:44.033839    3575 api_server.go:166] Checking apiserver status ...
	I0831 15:34:44.033875    3575 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:44.045301    3575 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:34:44.053564    3575 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:44.053608    3575 ssh_runner.go:195] Run: ls
	I0831 15:34:44.056720    3575 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:44.059771    3575 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:44.059783    3575 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:34:44.059790    3575 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:44.059800    3575 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:34:44.060059    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:44.060079    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:44.068875    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51545
	I0831 15:34:44.069213    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:44.069554    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:44.069573    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:44.069773    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:44.069885    3575 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:34:44.069969    3575 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:44.070037    3575 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:34:44.071091    3575 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:34:44.071101    3575 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:44.071350    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:44.071379    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:44.079946    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51547
	I0831 15:34:44.080265    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:44.080595    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:44.080611    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:44.080802    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:44.080904    3575 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:34:44.080988    3575 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:44.081224    3575 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:44.081249    3575 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:44.089631    3575 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51549
	I0831 15:34:44.090014    3575 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:44.090367    3575 main.go:141] libmachine: Using API Version  1
	I0831 15:34:44.090381    3575 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:44.090577    3575 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:44.090700    3575 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:34:44.090830    3575 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:44.090842    3575 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:34:44.090922    3575 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:34:44.091004    3575 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:34:44.091113    3575 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:34:44.091188    3575 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:34:44.128851    3575 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:44.145318    3575 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (451.950393ms)

-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0831 15:34:47.414153    3589 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:34:47.414343    3589 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:47.414349    3589 out.go:358] Setting ErrFile to fd 2...
	I0831 15:34:47.414353    3589 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:47.414518    3589 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:34:47.414683    3589 out.go:352] Setting JSON to false
	I0831 15:34:47.414710    3589 mustload.go:65] Loading cluster: ha-949000
	I0831 15:34:47.414762    3589 notify.go:220] Checking for updates...
	I0831 15:34:47.415030    3589 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:34:47.415046    3589 status.go:255] checking status of ha-949000 ...
	I0831 15:34:47.415392    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.415446    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.424561    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51553
	I0831 15:34:47.424931    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.425339    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.425348    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.425556    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.425673    3589 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:34:47.425754    3589 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:47.425824    3589 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:34:47.426844    3589 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:34:47.426867    3589 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:47.427124    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.427145    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.435699    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51555
	I0831 15:34:47.436056    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.436448    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.436468    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.436675    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.436774    3589 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:34:47.436861    3589 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:47.437120    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.437149    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.445672    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51557
	I0831 15:34:47.445998    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.446310    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.446320    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.446531    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.446647    3589 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:34:47.446790    3589 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:47.446807    3589 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:34:47.446888    3589 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:34:47.446967    3589 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:34:47.447051    3589 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:34:47.447136    3589 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:34:47.483626    3589 ssh_runner.go:195] Run: systemctl --version
	I0831 15:34:47.488426    3589 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:47.500442    3589 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:47.500471    3589 api_server.go:166] Checking apiserver status ...
	I0831 15:34:47.500513    3589 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:47.513074    3589 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:34:47.521671    3589 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:47.521736    3589 ssh_runner.go:195] Run: ls
	I0831 15:34:47.524859    3589 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:47.528089    3589 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:47.528101    3589 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:34:47.528110    3589 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:47.528124    3589 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:34:47.528374    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.528395    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.537037    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51561
	I0831 15:34:47.537381    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.537731    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.537746    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.537975    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.538092    3589 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:34:47.538167    3589 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:47.538241    3589 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:34:47.539261    3589 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:34:47.539273    3589 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:47.539517    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.539544    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.547936    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51563
	I0831 15:34:47.548267    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.548594    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.548604    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.548796    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.548893    3589 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:34:47.548970    3589 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:47.549217    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.549238    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.557545    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51565
	I0831 15:34:47.557902    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.558206    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.558215    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.558425    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.558528    3589 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:34:47.558649    3589 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:47.558661    3589 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:34:47.558746    3589 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:34:47.558820    3589 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:34:47.558900    3589 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:34:47.558977    3589 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:34:47.588656    3589 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:47.601911    3589 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:47.601926    3589 api_server.go:166] Checking apiserver status ...
	I0831 15:34:47.601978    3589 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:47.614461    3589 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup
	W0831 15:34:47.622838    3589 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:47.622893    3589 ssh_runner.go:195] Run: ls
	I0831 15:34:47.626125    3589 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:47.629267    3589 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:47.629282    3589 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:34:47.629291    3589 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:47.629301    3589 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:34:47.629563    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.629584    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.638282    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51569
	I0831 15:34:47.638679    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.639095    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.639111    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.639404    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.639558    3589 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:34:47.639678    3589 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:47.639798    3589 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:34:47.640940    3589 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:34:47.640952    3589 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:47.641215    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.641245    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.649694    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51571
	I0831 15:34:47.650027    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.650329    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.650348    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.650562    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.650671    3589 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:34:47.650765    3589 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:47.651025    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.651049    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.659499    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51573
	I0831 15:34:47.659853    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.660234    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.660245    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.660459    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.660572    3589 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:34:47.660701    3589 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:47.660720    3589 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:34:47.660794    3589 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:34:47.660887    3589 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:34:47.660962    3589 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:34:47.661045    3589 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:34:47.689557    3589 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:47.701604    3589 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:47.701618    3589 api_server.go:166] Checking apiserver status ...
	I0831 15:34:47.701661    3589 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:47.713889    3589 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:34:47.721977    3589 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:47.722021    3589 ssh_runner.go:195] Run: ls
	I0831 15:34:47.725218    3589 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:47.728276    3589 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:47.728298    3589 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:34:47.728306    3589 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:47.728317    3589 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:34:47.728600    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.728622    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.737280    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51577
	I0831 15:34:47.737624    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.737970    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.737985    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.738209    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.738324    3589 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:34:47.738403    3589 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:47.738473    3589 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:34:47.739507    3589 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:34:47.739515    3589 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:47.739762    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.739787    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.748192    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51579
	I0831 15:34:47.748508    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.748818    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.748827    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.749055    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.749170    3589 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:34:47.749278    3589 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:47.749551    3589 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:47.749575    3589 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:47.757957    3589 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51581
	I0831 15:34:47.758282    3589 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:47.758605    3589 main.go:141] libmachine: Using API Version  1
	I0831 15:34:47.758616    3589 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:47.758808    3589 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:47.758901    3589 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:34:47.759019    3589 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:47.759037    3589 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:34:47.759116    3589 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:34:47.759206    3589 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:34:47.759284    3589 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:34:47.759353    3589 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:34:47.796048    3589 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:47.807434    3589 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (447.70085ms)

                                                
                                                
-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:34:51.785060    3603 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:34:51.785238    3603 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:51.785244    3603 out.go:358] Setting ErrFile to fd 2...
	I0831 15:34:51.785248    3603 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:51.785432    3603 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:34:51.785606    3603 out.go:352] Setting JSON to false
	I0831 15:34:51.785633    3603 mustload.go:65] Loading cluster: ha-949000
	I0831 15:34:51.785671    3603 notify.go:220] Checking for updates...
	I0831 15:34:51.785948    3603 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:34:51.785966    3603 status.go:255] checking status of ha-949000 ...
	I0831 15:34:51.786311    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:51.786356    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:51.794991    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51585
	I0831 15:34:51.795330    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:51.795773    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:51.795786    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:51.796029    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:51.796142    3603 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:34:51.796238    3603 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:51.796302    3603 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:34:51.797274    3603 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:34:51.797296    3603 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:51.797538    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:51.797557    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:51.805900    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51587
	I0831 15:34:51.806259    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:51.806595    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:51.806609    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:51.806805    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:51.806897    3603 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:34:51.806986    3603 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:51.807242    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:51.807263    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:51.817426    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51589
	I0831 15:34:51.817756    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:51.818080    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:51.818091    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:51.818302    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:51.818405    3603 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:34:51.818547    3603 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:51.818566    3603 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:34:51.818640    3603 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:34:51.818735    3603 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:34:51.818819    3603 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:34:51.818897    3603 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:34:51.856097    3603 ssh_runner.go:195] Run: systemctl --version
	I0831 15:34:51.861795    3603 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:51.872594    3603 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:51.872618    3603 api_server.go:166] Checking apiserver status ...
	I0831 15:34:51.872667    3603 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:51.883650    3603 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:34:51.890898    3603 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:51.890951    3603 ssh_runner.go:195] Run: ls
	I0831 15:34:51.894157    3603 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:51.898508    3603 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:51.898521    3603 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:34:51.898530    3603 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:51.898541    3603 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:34:51.898788    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:51.898823    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:51.907353    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51593
	I0831 15:34:51.907681    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:51.908005    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:51.908016    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:51.908203    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:51.908315    3603 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:34:51.908396    3603 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:51.908469    3603 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:34:51.909449    3603 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:34:51.909459    3603 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:51.909697    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:51.909721    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:51.918451    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51595
	I0831 15:34:51.918801    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:51.919165    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:51.919183    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:51.919403    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:51.919528    3603 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:34:51.919682    3603 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:51.919984    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:51.920007    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:51.929538    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51597
	I0831 15:34:51.929894    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:51.930229    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:51.930243    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:51.930460    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:51.930562    3603 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:34:51.930694    3603 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:51.930707    3603 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:34:51.930780    3603 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:34:51.930866    3603 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:34:51.930957    3603 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:34:51.931033    3603 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:34:51.960514    3603 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:51.971406    3603 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:51.971421    3603 api_server.go:166] Checking apiserver status ...
	I0831 15:34:51.971460    3603 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:51.983567    3603 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup
	W0831 15:34:51.990929    3603 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:51.990986    3603 ssh_runner.go:195] Run: ls
	I0831 15:34:51.994185    3603 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:51.997253    3603 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:51.997264    3603 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:34:51.997272    3603 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:51.997288    3603 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:34:51.997555    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:51.997577    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:52.006288    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51601
	I0831 15:34:52.006647    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:52.006996    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:52.007011    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:52.007213    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:52.007327    3603 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:34:52.007408    3603 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:52.007479    3603 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:34:52.008492    3603 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:34:52.008502    3603 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:52.008757    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:52.008784    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:52.017326    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51603
	I0831 15:34:52.017680    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:52.018006    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:52.018016    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:52.018209    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:52.018314    3603 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:34:52.018395    3603 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:52.018632    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:52.018661    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:52.027138    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51605
	I0831 15:34:52.027482    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:52.027823    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:52.027840    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:52.028053    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:52.028165    3603 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:34:52.028296    3603 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:52.028308    3603 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:34:52.028381    3603 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:34:52.028455    3603 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:34:52.028535    3603 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:34:52.028617    3603 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:34:52.056016    3603 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:52.067970    3603 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:52.067985    3603 api_server.go:166] Checking apiserver status ...
	I0831 15:34:52.068026    3603 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:52.080188    3603 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:34:52.088017    3603 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:52.088064    3603 ssh_runner.go:195] Run: ls
	I0831 15:34:52.091667    3603 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:52.094678    3603 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:52.094690    3603 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:34:52.094698    3603 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:52.094710    3603 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:34:52.094958    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:52.094978    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:52.103728    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51609
	I0831 15:34:52.104077    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:52.104440    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:52.104457    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:52.104657    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:52.104759    3603 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:34:52.104838    3603 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:52.104907    3603 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:34:52.105902    3603 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:34:52.105914    3603 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:52.106193    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:52.106218    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:52.114724    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51611
	I0831 15:34:52.115074    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:52.115426    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:52.115440    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:52.115662    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:52.115764    3603 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:34:52.115842    3603 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:52.116104    3603 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:52.116127    3603 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:52.124691    3603 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51613
	I0831 15:34:52.125051    3603 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:52.125410    3603 main.go:141] libmachine: Using API Version  1
	I0831 15:34:52.125425    3603 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:52.125650    3603 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:52.125759    3603 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:34:52.125911    3603 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:52.125923    3603 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:34:52.126036    3603 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:34:52.126148    3603 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:34:52.126267    3603 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:34:52.126366    3603 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:34:52.163086    3603 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:52.174852    3603 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (448.484471ms)

                                                
                                                
-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:34:56.598630    3619 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:34:56.598894    3619 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:56.598899    3619 out.go:358] Setting ErrFile to fd 2...
	I0831 15:34:56.598903    3619 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:34:56.599073    3619 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:34:56.599253    3619 out.go:352] Setting JSON to false
	I0831 15:34:56.599276    3619 mustload.go:65] Loading cluster: ha-949000
	I0831 15:34:56.599316    3619 notify.go:220] Checking for updates...
	I0831 15:34:56.599566    3619 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:34:56.599582    3619 status.go:255] checking status of ha-949000 ...
	I0831 15:34:56.599934    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.599989    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.608788    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51617
	I0831 15:34:56.609190    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.609598    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.609607    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.609819    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.609943    3619 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:34:56.610024    3619 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:56.610098    3619 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:34:56.611081    3619 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:34:56.611104    3619 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:56.611370    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.611389    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.620036    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51619
	I0831 15:34:56.620392    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.620726    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.620765    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.620981    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.621096    3619 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:34:56.621182    3619 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:34:56.621445    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.621473    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.629970    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51621
	I0831 15:34:56.630293    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.630633    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.630648    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.630834    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.630930    3619 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:34:56.631059    3619 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:56.631078    3619 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:34:56.631153    3619 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:34:56.631226    3619 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:34:56.631311    3619 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:34:56.631385    3619 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:34:56.668643    3619 ssh_runner.go:195] Run: systemctl --version
	I0831 15:34:56.673220    3619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:56.683980    3619 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:56.684004    3619 api_server.go:166] Checking apiserver status ...
	I0831 15:34:56.684048    3619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:56.694920    3619 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:34:56.702575    3619 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:56.702642    3619 ssh_runner.go:195] Run: ls
	I0831 15:34:56.705956    3619 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:56.710256    3619 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:56.710274    3619 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:34:56.710282    3619 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:56.710293    3619 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:34:56.710572    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.710613    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.719152    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51625
	I0831 15:34:56.719489    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.719816    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.719830    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.720048    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.720145    3619 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:34:56.720237    3619 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:56.720300    3619 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:34:56.721286    3619 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:34:56.721295    3619 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:56.721557    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.721581    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.730028    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51627
	I0831 15:34:56.730369    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.730719    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.730737    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.730943    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.731042    3619 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:34:56.731127    3619 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:34:56.731396    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.731419    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.739977    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51629
	I0831 15:34:56.740327    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.740674    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.740690    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.740889    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.741023    3619 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:34:56.741152    3619 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:56.741163    3619 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:34:56.741247    3619 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:34:56.741323    3619 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:34:56.741406    3619 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:34:56.741488    3619 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:34:56.773381    3619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:56.785192    3619 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:56.785210    3619 api_server.go:166] Checking apiserver status ...
	I0831 15:34:56.785248    3619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:56.796419    3619 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup
	W0831 15:34:56.804270    3619 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:56.804312    3619 ssh_runner.go:195] Run: ls
	I0831 15:34:56.807343    3619 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:56.810430    3619 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:56.810441    3619 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:34:56.810450    3619 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:56.810466    3619 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:34:56.810705    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.810730    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.819204    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51633
	I0831 15:34:56.819558    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.819890    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.819900    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.820122    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.820222    3619 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:34:56.820308    3619 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:56.820375    3619 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:34:56.821364    3619 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:34:56.821374    3619 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:56.821625    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.821650    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.830281    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51635
	I0831 15:34:56.830626    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.830962    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.830977    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.831173    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.831275    3619 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:34:56.831359    3619 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:34:56.831621    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.831645    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.839998    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51637
	I0831 15:34:56.840366    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.840685    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.840694    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.840912    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.841027    3619 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:34:56.841184    3619 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:56.841196    3619 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:34:56.841286    3619 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:34:56.841361    3619 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:34:56.841480    3619 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:34:56.841555    3619 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:34:56.869326    3619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:56.881014    3619 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:34:56.881029    3619 api_server.go:166] Checking apiserver status ...
	I0831 15:34:56.881072    3619 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:34:56.893160    3619 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:34:56.903337    3619 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:34:56.903394    3619 ssh_runner.go:195] Run: ls
	I0831 15:34:56.906544    3619 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:34:56.909596    3619 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:34:56.909613    3619 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:34:56.909621    3619 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:34:56.909634    3619 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:34:56.909893    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.909916    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.918593    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51641
	I0831 15:34:56.918962    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.919323    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.919334    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.919565    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.919674    3619 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:34:56.919765    3619 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:34:56.919838    3619 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:34:56.920828    3619 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:34:56.920839    3619 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:56.921091    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.921113    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.929608    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51643
	I0831 15:34:56.929958    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.930326    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.930344    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.930541    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.930640    3619 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:34:56.930717    3619 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:34:56.930977    3619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:34:56.930999    3619 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:34:56.939549    3619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51645
	I0831 15:34:56.939898    3619 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:34:56.940227    3619 main.go:141] libmachine: Using API Version  1
	I0831 15:34:56.940238    3619 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:34:56.940459    3619 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:34:56.940577    3619 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:34:56.940696    3619 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:34:56.940709    3619 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:34:56.940815    3619 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:34:56.940898    3619 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:34:56.940984    3619 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:34:56.941062    3619 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:34:56.977608    3619 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:34:56.989348    3619 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (443.970971ms)

                                                
                                                
-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:35:07.219848    3637 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:35:07.220040    3637 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:35:07.220046    3637 out.go:358] Setting ErrFile to fd 2...
	I0831 15:35:07.220050    3637 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:35:07.220215    3637 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:35:07.220392    3637 out.go:352] Setting JSON to false
	I0831 15:35:07.220417    3637 mustload.go:65] Loading cluster: ha-949000
	I0831 15:35:07.220453    3637 notify.go:220] Checking for updates...
	I0831 15:35:07.220762    3637 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:35:07.220779    3637 status.go:255] checking status of ha-949000 ...
	I0831 15:35:07.221133    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.221179    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.230219    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51649
	I0831 15:35:07.230634    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.231077    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.231086    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.231291    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.231395    3637 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:35:07.231475    3637 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:07.231544    3637 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:35:07.232482    3637 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:35:07.232503    3637 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:35:07.232756    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.232778    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.241195    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51651
	I0831 15:35:07.241536    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.241928    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.241958    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.242150    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.242254    3637 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:35:07.242329    3637 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:35:07.242607    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.242631    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.251055    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51653
	I0831 15:35:07.251388    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.251720    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.251735    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.251933    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.252032    3637 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:35:07.252169    3637 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:07.252187    3637 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:35:07.252261    3637 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:35:07.252337    3637 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:35:07.252425    3637 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:35:07.252518    3637 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:35:07.290127    3637 ssh_runner.go:195] Run: systemctl --version
	I0831 15:35:07.294508    3637 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:07.305227    3637 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:35:07.305250    3637 api_server.go:166] Checking apiserver status ...
	I0831 15:35:07.305295    3637 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:35:07.316386    3637 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:35:07.323460    3637 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:35:07.323528    3637 ssh_runner.go:195] Run: ls
	I0831 15:35:07.327347    3637 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:35:07.330382    3637 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:35:07.330394    3637 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:35:07.330402    3637 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:35:07.330419    3637 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:35:07.330659    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.330678    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.339505    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51657
	I0831 15:35:07.339851    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.340186    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.340202    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.340434    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.340540    3637 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:35:07.340630    3637 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:07.340712    3637 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:35:07.341687    3637 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:35:07.341695    3637 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:35:07.341965    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.341990    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.350716    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51659
	I0831 15:35:07.351051    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.351399    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.351413    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.351613    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.351721    3637 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:35:07.351806    3637 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:35:07.352064    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.352093    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.360736    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51661
	I0831 15:35:07.361077    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.361422    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.361435    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.361640    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.361742    3637 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:35:07.361869    3637 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:07.361880    3637 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:35:07.361978    3637 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:35:07.362057    3637 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:35:07.362133    3637 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:35:07.362215    3637 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:35:07.391889    3637 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:07.403011    3637 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:35:07.403028    3637 api_server.go:166] Checking apiserver status ...
	I0831 15:35:07.403069    3637 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:35:07.415762    3637 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup
	W0831 15:35:07.423189    3637 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:35:07.423237    3637 ssh_runner.go:195] Run: ls
	I0831 15:35:07.426394    3637 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:35:07.429448    3637 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:35:07.429459    3637 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:35:07.429466    3637 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:35:07.429477    3637 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:35:07.429756    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.429775    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.438258    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51665
	I0831 15:35:07.438607    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.438954    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.438967    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.439179    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.439286    3637 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:35:07.439362    3637 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:07.439431    3637 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:35:07.440388    3637 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:35:07.440398    3637 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:35:07.440638    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.440665    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.449076    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51667
	I0831 15:35:07.449416    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.449753    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.449766    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.449975    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.450079    3637 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:35:07.450155    3637 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:35:07.450412    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.450433    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.459054    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51669
	I0831 15:35:07.459398    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.459754    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.459767    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.459957    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.460056    3637 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:35:07.460188    3637 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:07.460199    3637 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:35:07.460273    3637 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:35:07.460353    3637 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:35:07.460426    3637 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:35:07.460500    3637 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:35:07.488544    3637 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:07.501024    3637 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:35:07.501041    3637 api_server.go:166] Checking apiserver status ...
	I0831 15:35:07.501082    3637 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:35:07.512667    3637 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:35:07.521010    3637 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:35:07.521056    3637 ssh_runner.go:195] Run: ls
	I0831 15:35:07.524180    3637 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:35:07.527417    3637 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:35:07.527430    3637 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:35:07.527439    3637 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:35:07.527450    3637 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:35:07.527721    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.527742    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.536779    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51673
	I0831 15:35:07.537155    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.537511    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.537526    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.537731    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.537847    3637 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:35:07.537928    3637 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:07.538003    3637 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:35:07.538977    3637 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:35:07.538988    3637 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:35:07.539240    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.539264    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.547822    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51675
	I0831 15:35:07.548155    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.548492    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.548509    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.548720    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.548830    3637 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:35:07.548917    3637 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:35:07.549165    3637 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:07.549189    3637 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:07.557787    3637 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51677
	I0831 15:35:07.558131    3637 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:07.558494    3637 main.go:141] libmachine: Using API Version  1
	I0831 15:35:07.558511    3637 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:07.558723    3637 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:07.558825    3637 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:35:07.558949    3637 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:07.558960    3637 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:35:07.559024    3637 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:35:07.559109    3637 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:35:07.559198    3637 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:35:07.559276    3637 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:35:07.595277    3637 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:07.606874    3637 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
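The status transcript above repeats the same per-node probe sequence for every cluster member: read the host state from the hyperkit pid, run `sudo systemctl is-active --quiet service kubelet` over SSH, and, for control-plane nodes, check the apiserver via `https://192.169.0.254:8443/healthz`. As a rough illustration only (not minikube's actual implementation; node names and field labels are taken from the `-- stdout --` blocks in this report), the printed summary can be parsed to spot the failing component:

```python
def parse_status(out: str) -> dict[str, dict[str, str]]:
    """Split `minikube status` text output into {node: {field: value}}.

    Node names are the lines without a ": " separator; the indented
    "key: value" lines that follow belong to the most recent node.
    """
    nodes: dict[str, dict[str, str]] = {}
    current = None
    for raw in out.splitlines():
        line = raw.strip()
        if not line:
            continue
        if ": " in line and current is not None:
            key, _, value = line.partition(": ")
            nodes[current][key] = value
        else:
            current = line
            nodes[current] = {}
    return nodes


def stopped_nodes(nodes: dict[str, dict[str, str]]) -> list[str]:
    """Names of nodes whose kubelet is reported as Stopped."""
    return [name for name, fields in nodes.items()
            if fields.get("kubelet") == "Stopped"]
```

Applied to the output above, `stopped_nodes` would single out `ha-949000-m04`, the worker whose stopped kubelet is what makes the `status` invocation exit non-zero here.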
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (446.425396ms)

                                                
                                                
-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:35:20.725448    3657 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:35:20.725635    3657 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:35:20.725641    3657 out.go:358] Setting ErrFile to fd 2...
	I0831 15:35:20.725645    3657 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:35:20.725825    3657 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:35:20.726035    3657 out.go:352] Setting JSON to false
	I0831 15:35:20.726059    3657 mustload.go:65] Loading cluster: ha-949000
	I0831 15:35:20.726095    3657 notify.go:220] Checking for updates...
	I0831 15:35:20.726375    3657 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:35:20.726391    3657 status.go:255] checking status of ha-949000 ...
	I0831 15:35:20.726764    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:20.726821    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:20.736100    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51681
	I0831 15:35:20.736457    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:20.736861    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:20.736885    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:20.737090    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:20.737191    3657 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:35:20.737279    3657 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:20.737340    3657 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:35:20.738314    3657 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:35:20.738339    3657 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:35:20.738585    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:20.738609    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:20.747136    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51683
	I0831 15:35:20.747573    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:20.747952    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:20.747964    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:20.748204    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:20.748331    3657 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:35:20.748413    3657 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:35:20.748665    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:20.748687    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:20.757229    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51685
	I0831 15:35:20.757552    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:20.757902    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:20.757915    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:20.758134    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:20.758253    3657 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:35:20.758402    3657 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:20.758420    3657 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:35:20.758502    3657 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:35:20.758583    3657 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:35:20.758663    3657 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:35:20.758747    3657 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:35:20.795528    3657 ssh_runner.go:195] Run: systemctl --version
	I0831 15:35:20.800117    3657 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:20.810752    3657 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:35:20.810776    3657 api_server.go:166] Checking apiserver status ...
	I0831 15:35:20.810820    3657 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:35:20.821691    3657 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:35:20.829828    3657 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:35:20.829881    3657 ssh_runner.go:195] Run: ls
	I0831 15:35:20.833353    3657 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:35:20.836780    3657 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:35:20.836794    3657 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:35:20.836804    3657 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:35:20.836817    3657 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:35:20.837067    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:20.837102    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:20.845480    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51689
	I0831 15:35:20.845814    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:20.846124    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:20.846135    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:20.846352    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:20.846455    3657 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:35:20.846543    3657 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:20.846618    3657 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:35:20.847596    3657 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:35:20.847607    3657 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:35:20.847862    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:20.847889    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:20.856334    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51691
	I0831 15:35:20.856676    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:20.857010    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:20.857024    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:20.857252    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:20.857364    3657 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:35:20.857445    3657 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:35:20.857713    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:20.857736    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:20.866242    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51693
	I0831 15:35:20.866602    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:20.866950    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:20.866964    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:20.867186    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:20.867283    3657 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:35:20.867411    3657 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:20.867422    3657 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:35:20.867493    3657 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:35:20.867571    3657 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:35:20.867647    3657 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:35:20.867739    3657 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:35:20.899671    3657 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:20.911245    3657 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:35:20.911259    3657 api_server.go:166] Checking apiserver status ...
	I0831 15:35:20.911297    3657 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:35:20.922527    3657 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup
	W0831 15:35:20.930271    3657 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:35:20.930317    3657 ssh_runner.go:195] Run: ls
	I0831 15:35:20.933847    3657 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:35:20.936867    3657 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:35:20.936879    3657 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:35:20.936887    3657 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:35:20.936896    3657 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:35:20.937151    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:20.937177    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:20.945721    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51697
	I0831 15:35:20.946060    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:20.946383    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:20.946393    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:20.946616    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:20.946737    3657 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:35:20.946843    3657 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:20.946938    3657 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:35:20.947916    3657 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:35:20.947927    3657 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:35:20.948175    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:20.948201    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:20.956738    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51699
	I0831 15:35:20.957086    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:20.957443    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:20.957460    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:20.957681    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:20.957788    3657 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:35:20.957876    3657 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:35:20.958133    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:20.958155    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:20.966646    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51701
	I0831 15:35:20.966979    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:20.967286    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:20.967296    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:20.967513    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:20.967620    3657 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:35:20.967730    3657 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:20.967742    3657 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:35:20.967821    3657 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:35:20.967921    3657 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:35:20.968007    3657 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:35:20.968086    3657 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:35:20.996788    3657 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:21.008524    3657 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:35:21.008541    3657 api_server.go:166] Checking apiserver status ...
	I0831 15:35:21.008581    3657 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:35:21.019901    3657 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:35:21.028343    3657 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:35:21.028389    3657 ssh_runner.go:195] Run: ls
	I0831 15:35:21.031817    3657 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:35:21.034940    3657 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:35:21.034953    3657 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:35:21.034961    3657 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:35:21.034971    3657 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:35:21.035211    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:21.035235    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:21.043869    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51705
	I0831 15:35:21.044222    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:21.044539    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:21.044557    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:21.044760    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:21.044878    3657 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:35:21.044970    3657 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:21.045043    3657 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:35:21.046004    3657 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:35:21.046014    3657 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:35:21.046262    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:21.046305    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:21.054814    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51707
	I0831 15:35:21.055152    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:21.055457    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:21.055470    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:21.055677    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:21.055780    3657 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:35:21.055864    3657 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:35:21.056109    3657 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:21.056129    3657 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:21.064488    3657 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51709
	I0831 15:35:21.064820    3657 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:21.065133    3657 main.go:141] libmachine: Using API Version  1
	I0831 15:35:21.065144    3657 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:21.065356    3657 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:21.065474    3657 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:35:21.065601    3657 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:21.065621    3657 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:35:21.065702    3657 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:35:21.065782    3657 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:35:21.065871    3657 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:35:21.065945    3657 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:35:21.102760    3657 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:21.114455    3657 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (449.054158ms)

                                                
                                                
-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:35:32.745290    3677 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:35:32.745581    3677 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:35:32.745587    3677 out.go:358] Setting ErrFile to fd 2...
	I0831 15:35:32.745591    3677 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:35:32.745761    3677 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:35:32.745940    3677 out.go:352] Setting JSON to false
	I0831 15:35:32.745962    3677 mustload.go:65] Loading cluster: ha-949000
	I0831 15:35:32.746014    3677 notify.go:220] Checking for updates...
	I0831 15:35:32.746269    3677 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:35:32.746285    3677 status.go:255] checking status of ha-949000 ...
	I0831 15:35:32.746659    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:32.746709    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:32.755848    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51713
	I0831 15:35:32.756246    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:32.756663    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:32.756679    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:32.756953    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:32.757079    3677 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:35:32.757184    3677 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:32.757261    3677 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:35:32.758236    3677 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:35:32.758259    3677 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:35:32.758518    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:32.758555    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:32.767136    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51715
	I0831 15:35:32.767456    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:32.767809    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:32.767823    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:32.768033    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:32.768145    3677 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:35:32.768239    3677 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:35:32.768481    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:32.768505    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:32.776943    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51717
	I0831 15:35:32.777252    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:32.777596    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:32.777611    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:32.777808    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:32.777901    3677 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:35:32.778037    3677 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:32.778057    3677 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:35:32.778132    3677 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:35:32.778202    3677 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:35:32.778270    3677 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:35:32.778341    3677 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:35:32.816203    3677 ssh_runner.go:195] Run: systemctl --version
	I0831 15:35:32.820549    3677 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:32.831218    3677 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:35:32.831242    3677 api_server.go:166] Checking apiserver status ...
	I0831 15:35:32.831281    3677 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:35:32.842378    3677 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup
	W0831 15:35:32.849825    3677 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2000/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:35:32.849875    3677 ssh_runner.go:195] Run: ls
	I0831 15:35:32.853208    3677 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:35:32.856402    3677 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:35:32.856417    3677 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:35:32.856426    3677 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:35:32.856437    3677 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:35:32.856699    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:32.856724    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:32.865372    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51721
	I0831 15:35:32.865734    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:32.866086    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:32.866098    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:32.866335    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:32.866443    3677 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:35:32.866541    3677 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:32.866612    3677 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:35:32.867603    3677 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:35:32.867614    3677 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:35:32.867863    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:32.867900    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:32.876846    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51723
	I0831 15:35:32.877189    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:32.877499    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:32.877511    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:32.877721    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:32.877822    3677 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:35:32.877904    3677 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:35:32.878159    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:32.878187    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:32.886705    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51725
	I0831 15:35:32.887028    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:32.887344    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:32.887354    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:32.887595    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:32.887699    3677 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:35:32.887846    3677 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:32.887858    3677 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:35:32.887935    3677 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:35:32.888009    3677 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:35:32.888089    3677 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:35:32.888163    3677 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:35:32.919773    3677 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:32.931214    3677 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:35:32.931227    3677 api_server.go:166] Checking apiserver status ...
	I0831 15:35:32.931268    3677 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:35:32.942675    3677 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup
	W0831 15:35:32.950937    3677 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1981/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:35:32.950984    3677 ssh_runner.go:195] Run: ls
	I0831 15:35:32.954522    3677 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:35:32.957576    3677 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:35:32.957587    3677 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:35:32.957594    3677 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:35:32.957605    3677 status.go:255] checking status of ha-949000-m03 ...
	I0831 15:35:32.957863    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:32.957895    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:32.966492    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51729
	I0831 15:35:32.966845    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:32.967154    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:32.967164    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:32.967368    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:32.967480    3677 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:35:32.967557    3677 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:32.967629    3677 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:35:32.968580    3677 status.go:330] ha-949000-m03 host status = "Running" (err=<nil>)
	I0831 15:35:32.968591    3677 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:35:32.968838    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:32.968874    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:32.977333    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51731
	I0831 15:35:32.977693    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:32.978044    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:32.978058    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:32.978285    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:32.978393    3677 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:35:32.978483    3677 host.go:66] Checking if "ha-949000-m03" exists ...
	I0831 15:35:32.978729    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:32.978752    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:32.987159    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51733
	I0831 15:35:32.987515    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:32.987877    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:32.987894    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:32.988096    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:32.988226    3677 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:35:32.988366    3677 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:32.988377    3677 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:35:32.988468    3677 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:35:32.988568    3677 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:35:32.988660    3677 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:35:32.988735    3677 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:35:33.017175    3677 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:33.028886    3677 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:35:33.028900    3677 api_server.go:166] Checking apiserver status ...
	I0831 15:35:33.028938    3677 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:35:33.040638    3677 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup
	W0831 15:35:33.050639    3677 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1944/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:35:33.050694    3677 ssh_runner.go:195] Run: ls
	I0831 15:35:33.054066    3677 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:35:33.057089    3677 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:35:33.057101    3677 status.go:422] ha-949000-m03 apiserver status = Running (err=<nil>)
	I0831 15:35:33.057109    3677 status.go:257] ha-949000-m03 status: &{Name:ha-949000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:35:33.057119    3677 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:35:33.057365    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:33.057385    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:33.065994    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51737
	I0831 15:35:33.066362    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:33.066703    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:33.066719    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:33.066921    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:33.067028    3677 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:35:33.067118    3677 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:35:33.067191    3677 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:35:33.068160    3677 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:35:33.068169    3677 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:35:33.068424    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:33.068450    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:33.077048    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51739
	I0831 15:35:33.077385    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:33.077702    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:33.077712    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:33.077911    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:33.078006    3677 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:35:33.078082    3677 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:35:33.078335    3677 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:35:33.078358    3677 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:35:33.086924    3677 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51741
	I0831 15:35:33.087254    3677 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:35:33.087626    3677 main.go:141] libmachine: Using API Version  1
	I0831 15:35:33.087643    3677 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:35:33.087857    3677 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:35:33.087977    3677 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:35:33.088109    3677 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:35:33.088121    3677 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:35:33.088207    3677 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:35:33.088299    3677 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:35:33.088388    3677 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:35:33.088473    3677 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:35:33.124805    3677 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:35:33.136525    3677 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:432: failed to run minikube status. args "out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr" : exit status 2
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-949000 -n ha-949000
helpers_test.go:245: <<< TestMultiControlPlane/serial/RestartSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:246: ======>  post-mortem[TestMultiControlPlane/serial/RestartSecondaryNode]: minikube logs <======
helpers_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 logs -n 25
helpers_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 logs -n 25: (2.396460735s)
helpers_test.go:253: TestMultiControlPlane/serial/RestartSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |      Profile      |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| delete  | -p functional-593000                 | functional-593000 | jenkins | v1.33.1 | 31 Aug 24 15:29 PDT | 31 Aug 24 15:29 PDT |
	| start   | -p ha-949000 --wait=true             | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:29 PDT | 31 Aug 24 15:32 PDT |
	|         | --memory=2200 --ha                   |                   |         |         |                     |                     |
	|         | -v=7 --alsologtostderr               |                   |         |         |                     |                     |
	|         | --driver=hyperkit                    |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- apply -f             | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | ./testdata/ha/ha-pod-dns-test.yaml   |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- rollout status       | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | deployment/busybox                   |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- get pods -o          | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| node    | add -p ha-949000 -v=7                | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT |                     |
	|         | --alsologtostderr                    |                   |         |         |                     |                     |
	| node    | ha-949000 node stop m02 -v=7         | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:33 PDT | 31 Aug 24 15:33 PDT |
	|         | --alsologtostderr                    |                   |         |         |                     |                     |
	| node    | ha-949000 node start m02 -v=7        | ha-949000         | jenkins | v1.33.1 | 31 Aug 24 15:34 PDT | 31 Aug 24 15:34 PDT |
	|         | --alsologtostderr                    |                   |         |         |                     |                     |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:29:09
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:29:09.276641    2876 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:29:09.276909    2876 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:29:09.276915    2876 out.go:358] Setting ErrFile to fd 2...
	I0831 15:29:09.276919    2876 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:29:09.277077    2876 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:29:09.278657    2876 out.go:352] Setting JSON to false
	I0831 15:29:09.304076    2876 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1720,"bootTime":1725141629,"procs":442,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:29:09.304206    2876 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:29:09.363205    2876 out.go:177] * [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:29:09.404287    2876 notify.go:220] Checking for updates...
	I0831 15:29:09.428120    2876 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:29:09.489040    2876 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:09.566857    2876 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:29:09.611464    2876 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:29:09.632356    2876 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:09.653358    2876 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:29:09.674652    2876 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:29:09.704277    2876 out.go:177] * Using the hyperkit driver based on user configuration
	I0831 15:29:09.746520    2876 start.go:297] selected driver: hyperkit
	I0831 15:29:09.746549    2876 start.go:901] validating driver "hyperkit" against <nil>
	I0831 15:29:09.746572    2876 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:29:09.750947    2876 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:29:09.751059    2876 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:29:09.759462    2876 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:29:09.763334    2876 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:09.763355    2876 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:29:09.763386    2876 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 15:29:09.763603    2876 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:29:09.763661    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:09.763670    2876 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0831 15:29:09.763676    2876 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0831 15:29:09.763757    2876 start.go:340] cluster config:
	{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:29:09.763847    2876 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:29:09.806188    2876 out.go:177] * Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	I0831 15:29:09.827330    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:09.827400    2876 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:29:09.827429    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:29:09.827640    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:29:09.827663    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:29:09.828200    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:09.828242    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json: {Name:mka3af2c42dba1cbf0f487cd55ddf735793024ce Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:09.828849    2876 start.go:360] acquireMachinesLock for ha-949000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:29:09.828952    2876 start.go:364] duration metric: took 84.577µs to acquireMachinesLock for "ha-949000"
	I0831 15:29:09.828988    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:09.829059    2876 start.go:125] createHost starting for "" (driver="hyperkit")
	I0831 15:29:09.903354    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:29:09.903628    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:09.903698    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:09.913643    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51029
	I0831 15:29:09.913991    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:09.914387    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:09.914395    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:09.914636    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:09.914768    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:09.914873    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:09.915000    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:29:09.915023    2876 client.go:168] LocalClient.Create starting
	I0831 15:29:09.915061    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:29:09.915112    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:09.915129    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:09.915188    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:29:09.915229    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:09.915249    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:09.915265    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:29:09.915270    2876 main.go:141] libmachine: (ha-949000) Calling .PreCreateCheck
	I0831 15:29:09.915359    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:09.915528    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:09.915949    2876 main.go:141] libmachine: Creating machine...
	I0831 15:29:09.915958    2876 main.go:141] libmachine: (ha-949000) Calling .Create
	I0831 15:29:09.916028    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:09.916144    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:09.916024    2884 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:09.916224    2876 main.go:141] libmachine: (ha-949000) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:29:10.099863    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.099790    2884 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa...
	I0831 15:29:10.256390    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.256317    2884 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk...
	I0831 15:29:10.256437    2876 main.go:141] libmachine: (ha-949000) DBG | Writing magic tar header
	I0831 15:29:10.256445    2876 main.go:141] libmachine: (ha-949000) DBG | Writing SSH key tar header
	I0831 15:29:10.257253    2876 main.go:141] libmachine: (ha-949000) DBG | I0831 15:29:10.257126    2884 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000 ...
	I0831 15:29:10.614937    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:10.614967    2876 main.go:141] libmachine: (ha-949000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid
	I0831 15:29:10.615070    2876 main.go:141] libmachine: (ha-949000) DBG | Using UUID 98cab9ba-901d-49d1-9e6c-321a4533d56e
	I0831 15:29:10.724629    2876 main.go:141] libmachine: (ha-949000) DBG | Generated MAC ce:8:77:f7:42:5e
	I0831 15:29:10.724653    2876 main.go:141] libmachine: (ha-949000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:29:10.724744    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ae630)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:10.724785    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ae630)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:10.724823    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "98cab9ba-901d-49d1-9e6c-321a4533d56e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:29:10.724851    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 98cab9ba-901d-49d1-9e6c-321a4533d56e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:29:10.724862    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:29:10.727687    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 DEBUG: hyperkit: Pid is 2887
	I0831 15:29:10.728136    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 0
	I0831 15:29:10.728145    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:10.728201    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:10.729180    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:10.729276    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:10.729293    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:10.729309    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:10.729317    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:10.735289    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:29:10.788351    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:29:10.788955    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:10.788972    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:10.788980    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:10.788989    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:11.164652    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:29:11.164668    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:29:11.279214    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:11.279233    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:11.279245    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:11.279263    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:11.280165    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:29:11.280176    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:11 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:29:12.729552    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 1
	I0831 15:29:12.729568    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:12.729694    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:12.730495    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:12.730552    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:12.730566    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:12.730580    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:12.730595    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:14.731472    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 2
	I0831 15:29:14.731486    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:14.731548    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:14.732412    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:14.732458    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:14.732473    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:14.732492    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:14.732506    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:16.732786    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 3
	I0831 15:29:16.732802    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:16.732855    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:16.733685    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:16.733713    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:16.733721    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:16.733748    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:16.733759    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:16.839902    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:29:16.839946    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:29:16.839959    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:29:16.864989    2876 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:29:16 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:29:18.735154    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 4
	I0831 15:29:18.735170    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:18.735286    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:18.736038    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:18.736084    2876 main.go:141] libmachine: (ha-949000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0831 15:29:18.736094    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:18.736103    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:18.736112    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:20.736683    2876 main.go:141] libmachine: (ha-949000) DBG | Attempt 5
	I0831 15:29:20.736698    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:20.736791    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:20.737588    2876 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:29:20.737620    2876 main.go:141] libmachine: (ha-949000) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:20.737633    2876 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:20.737640    2876 main.go:141] libmachine: (ha-949000) DBG | Found match: ce:8:77:f7:42:5e
	I0831 15:29:20.737645    2876 main.go:141] libmachine: (ha-949000) DBG | IP: 192.169.0.5
	I0831 15:29:20.737694    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:20.738300    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:20.738400    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:20.738493    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:29:20.738503    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:20.738582    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:20.738639    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:20.739400    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:29:20.739409    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:29:20.739415    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:29:20.739420    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:20.739500    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:20.739608    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:20.739694    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:20.739784    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:20.739906    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:20.740082    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:20.740088    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:29:21.810169    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:29:21.810183    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:29:21.810190    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.810319    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.810409    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.810520    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.810622    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.810753    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.810899    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.810907    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:29:21.876064    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:29:21.876103    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:29:21.876110    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:29:21.876116    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:21.876252    2876 buildroot.go:166] provisioning hostname "ha-949000"
	I0831 15:29:21.876263    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:21.876353    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.876438    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.876542    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.876625    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.876705    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.876835    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.876977    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.876986    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000 && echo "ha-949000" | sudo tee /etc/hostname
	I0831 15:29:21.955731    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000
	
	I0831 15:29:21.955752    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:21.955889    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:21.955998    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.956098    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:21.956196    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:21.956332    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:21.956482    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:21.956494    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:29:22.031652    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:29:22.031674    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:29:22.031695    2876 buildroot.go:174] setting up certificates
	I0831 15:29:22.031704    2876 provision.go:84] configureAuth start
	I0831 15:29:22.031711    2876 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:29:22.031840    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:22.031922    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.032006    2876 provision.go:143] copyHostCerts
	I0831 15:29:22.032046    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:29:22.032109    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:29:22.032118    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:29:22.032257    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:29:22.032465    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:29:22.032502    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:29:22.032507    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:29:22.032592    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:29:22.032752    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:29:22.032790    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:29:22.032795    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:29:22.032874    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:29:22.033015    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000 san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]
	I0831 15:29:22.113278    2876 provision.go:177] copyRemoteCerts
	I0831 15:29:22.113334    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:29:22.113349    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.113477    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.113572    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.113653    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.113746    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:22.153055    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:29:22.153132    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:29:22.173186    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:29:22.173254    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0831 15:29:22.192526    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:29:22.192581    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:29:22.212150    2876 provision.go:87] duration metric: took 180.428736ms to configureAuth
	I0831 15:29:22.212163    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:29:22.212301    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:22.212314    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:22.212441    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.212522    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.212600    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.212680    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.212760    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.212882    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.213008    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.213015    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:29:22.281023    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:29:22.281035    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:29:22.281108    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:29:22.281121    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.281265    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.281355    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.281474    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.281559    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.281695    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.281836    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.281881    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:29:22.358523    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:29:22.358550    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:22.358687    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:22.358785    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.358873    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:22.358967    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:22.359137    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:22.359281    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:22.359293    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:29:23.900860    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:29:23.900883    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:29:23.900890    2876 main.go:141] libmachine: (ha-949000) Calling .GetURL
	I0831 15:29:23.901027    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:29:23.901035    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:29:23.901040    2876 client.go:171] duration metric: took 13.985813631s to LocalClient.Create
	I0831 15:29:23.901051    2876 start.go:167] duration metric: took 13.985855387s to libmachine.API.Create "ha-949000"
	I0831 15:29:23.901061    2876 start.go:293] postStartSetup for "ha-949000" (driver="hyperkit")
	I0831 15:29:23.901070    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:29:23.901080    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:23.901239    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:29:23.901251    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:23.901337    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:23.901438    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.901525    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:23.901622    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:23.947237    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:29:23.951946    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:29:23.951965    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:29:23.952069    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:29:23.952248    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:29:23.952255    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:29:23.952462    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:29:23.961814    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:29:23.990864    2876 start.go:296] duration metric: took 89.791408ms for postStartSetup
	I0831 15:29:23.990895    2876 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:29:23.991499    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:23.991642    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:23.991961    2876 start.go:128] duration metric: took 14.162686523s to createHost
	I0831 15:29:23.991974    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:23.992084    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:23.992175    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.992259    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:23.992348    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:23.992457    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:29:23.992584    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:29:23.992591    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:29:24.059500    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143363.867477750
	
	I0831 15:29:24.059512    2876 fix.go:216] guest clock: 1725143363.867477750
	I0831 15:29:24.059517    2876 fix.go:229] Guest: 2024-08-31 15:29:23.86747775 -0700 PDT Remote: 2024-08-31 15:29:23.991969 -0700 PDT m=+14.752935961 (delta=-124.49125ms)
	I0831 15:29:24.059536    2876 fix.go:200] guest clock delta is within tolerance: -124.49125ms
	I0831 15:29:24.059546    2876 start.go:83] releasing machines lock for "ha-949000", held for 14.230377343s
	I0831 15:29:24.059565    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.059706    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:24.059819    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060132    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060244    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:24.060319    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:29:24.060346    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:24.060384    2876 ssh_runner.go:195] Run: cat /version.json
	I0831 15:29:24.060396    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:24.060439    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:24.060498    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:24.060525    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:24.060623    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:24.060654    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:24.060746    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:24.060765    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:24.060837    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:24.096035    2876 ssh_runner.go:195] Run: systemctl --version
	I0831 15:29:24.148302    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:29:24.153275    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:29:24.153315    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:29:24.165840    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:29:24.165854    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:29:24.165972    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:29:24.181258    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:29:24.191149    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:29:24.200150    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:29:24.200197    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:29:24.209198    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:29:24.217930    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:29:24.227002    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:29:24.237048    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:29:24.246383    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:29:24.255322    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:29:24.264369    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:29:24.273487    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:29:24.282138    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:29:24.290220    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:24.385700    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:29:24.407032    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:29:24.407111    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:29:24.421439    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:29:24.437414    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:29:24.451401    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:29:24.463382    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:29:24.474406    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:29:24.507277    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:29:24.517707    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:29:24.532548    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:29:24.535464    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:29:24.542699    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:29:24.557395    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:29:24.662440    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:29:24.769422    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:29:24.769500    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:29:24.784888    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:24.881202    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:29:27.276172    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.394917578s)
	I0831 15:29:27.276233    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:29:27.287739    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:29:27.301676    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:29:27.312754    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:29:27.407771    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:29:27.503429    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:27.614933    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:29:27.628621    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:29:27.641141    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:27.759998    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:29:27.816359    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:29:27.816437    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:29:27.820881    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:29:27.820929    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:29:27.824109    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:29:27.852863    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:29:27.852937    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:29:27.870865    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:29:27.937728    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:29:27.937791    2876 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:29:27.938219    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:29:27.943196    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:29:27.954353    2876 kubeadm.go:883] updating cluster {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:29:27.954419    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:27.954480    2876 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:29:27.967028    2876 docker.go:685] Got preloaded images: 
	I0831 15:29:27.967040    2876 docker.go:691] registry.k8s.io/kube-apiserver:v1.31.0 wasn't preloaded
	I0831 15:29:27.967094    2876 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0831 15:29:27.975409    2876 ssh_runner.go:195] Run: which lz4
	I0831 15:29:27.978323    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0831 15:29:27.978434    2876 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0831 15:29:27.981530    2876 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0831 15:29:27.981546    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (342554258 bytes)
	I0831 15:29:28.829399    2876 docker.go:649] duration metric: took 850.988233ms to copy over tarball
	I0831 15:29:28.829466    2876 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0831 15:29:31.094292    2876 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.264775779s)
	I0831 15:29:31.094306    2876 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0831 15:29:31.120523    2876 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0831 15:29:31.129444    2876 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2631 bytes)
	I0831 15:29:31.144462    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:31.255144    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:29:33.625508    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.370311255s)
	I0831 15:29:33.625595    2876 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:29:33.642024    2876 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0831 15:29:33.642043    2876 cache_images.go:84] Images are preloaded, skipping loading
	I0831 15:29:33.642059    2876 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.31.0 docker true true} ...
	I0831 15:29:33.642140    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:29:33.642205    2876 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:29:33.687213    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:33.687227    2876 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0831 15:29:33.687238    2876 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:29:33.687253    2876 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-949000 NodeName:ha-949000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:29:33.687355    2876 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-949000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 15:29:33.687380    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:29:33.687436    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:29:33.701609    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:29:33.701679    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:29:33.701731    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:29:33.709907    2876 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:29:33.709972    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0831 15:29:33.717287    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0831 15:29:33.730443    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:29:33.743765    2876 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0831 15:29:33.758082    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1446 bytes)
	I0831 15:29:33.771561    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:29:33.774412    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:29:33.783869    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:29:33.875944    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:29:33.891425    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.5
	I0831 15:29:33.891438    2876 certs.go:194] generating shared ca certs ...
	I0831 15:29:33.891448    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:33.891633    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:29:33.891710    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:29:33.891723    2876 certs.go:256] generating profile certs ...
	I0831 15:29:33.891775    2876 certs.go:363] generating signed profile cert for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:29:33.891786    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt with IP's: []
	I0831 15:29:34.044423    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt ...
	I0831 15:29:34.044439    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt: {Name:mkff87193f625d157d1a4f89b0da256c90604083 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.044784    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key ...
	I0831 15:29:34.044793    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key: {Name:mke1833d9b208b07a8ff6dd57d320eb167de83a3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.045031    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93
	I0831 15:29:34.045046    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.254]
	I0831 15:29:34.207099    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 ...
	I0831 15:29:34.207118    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93: {Name:mk38f2742462440beada92d4e254471d0fe85db9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.207433    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93 ...
	I0831 15:29:34.207443    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93: {Name:mk29a130e2c97d3f060f247819d7c01c723a8502 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.207661    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.72b12f93 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:29:34.207842    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.72b12f93 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:29:34.208036    2876 certs.go:363] generating signed profile cert for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:29:34.208050    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt with IP's: []
	I0831 15:29:34.314095    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt ...
	I0831 15:29:34.314111    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt: {Name:mk708e4939e774d52c9a7d3335e0202d13493538 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.314481    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key ...
	I0831 15:29:34.314489    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key: {Name:mkcfbb0611781f7e5640984b0a9cc91976dc5482 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:34.314700    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:29:34.314732    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:29:34.314751    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:29:34.314769    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:29:34.314787    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:29:34.314811    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:29:34.314831    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:29:34.314850    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:29:34.314947    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:29:34.314997    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:29:34.315005    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:29:34.315034    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:29:34.315062    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:29:34.315091    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:29:34.315155    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:29:34.315187    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.315211    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.315229    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.315668    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:29:34.335288    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:29:34.355233    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:29:34.374357    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:29:34.393538    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0831 15:29:34.413840    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:29:34.433106    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:29:34.452816    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:29:34.472204    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:29:34.492102    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:29:34.512126    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:29:34.530945    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:29:34.546877    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:29:34.551681    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:29:34.565047    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.568688    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.568737    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:29:34.573250    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:29:34.587250    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:29:34.595871    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.599208    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.599248    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:29:34.603521    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:29:34.611689    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:29:34.620193    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.624378    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.624428    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:29:34.628785    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:29:34.637154    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:29:34.640263    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:29:34.640305    2876 kubeadm.go:392] StartCluster: {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:29:34.640393    2876 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:29:34.652254    2876 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:29:34.660013    2876 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0831 15:29:34.668312    2876 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0831 15:29:34.675860    2876 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0831 15:29:34.675868    2876 kubeadm.go:157] found existing configuration files:
	
	I0831 15:29:34.675907    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0831 15:29:34.683169    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0831 15:29:34.683212    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0831 15:29:34.690543    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0831 15:29:34.697493    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0831 15:29:34.697539    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0831 15:29:34.704850    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0831 15:29:34.712593    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0831 15:29:34.712643    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0831 15:29:34.720047    2876 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0831 15:29:34.727239    2876 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0831 15:29:34.727279    2876 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0831 15:29:34.734575    2876 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0831 15:29:34.806234    2876 kubeadm.go:310] [init] Using Kubernetes version: v1.31.0
	I0831 15:29:34.806318    2876 kubeadm.go:310] [preflight] Running pre-flight checks
	I0831 15:29:34.880330    2876 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0831 15:29:34.880424    2876 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0831 15:29:34.880492    2876 kubeadm.go:310] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I0831 15:29:34.888288    2876 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0831 15:29:34.931799    2876 out.go:235]   - Generating certificates and keys ...
	I0831 15:29:34.931855    2876 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0831 15:29:34.931917    2876 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0831 15:29:35.094247    2876 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0831 15:29:35.242021    2876 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0831 15:29:35.553368    2876 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0831 15:29:35.874778    2876 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0831 15:29:36.045823    2876 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0831 15:29:36.046072    2876 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-949000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0831 15:29:36.253528    2876 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0831 15:29:36.253651    2876 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-949000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0831 15:29:36.362185    2876 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0831 15:29:36.481613    2876 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0831 15:29:36.595099    2876 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0831 15:29:36.595231    2876 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0831 15:29:36.687364    2876 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0831 15:29:36.786350    2876 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0831 15:29:36.838505    2876 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0831 15:29:37.183406    2876 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0831 15:29:37.330529    2876 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0831 15:29:37.331123    2876 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0831 15:29:37.332869    2876 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0831 15:29:37.354639    2876 out.go:235]   - Booting up control plane ...
	I0831 15:29:37.354715    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0831 15:29:37.354798    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0831 15:29:37.354856    2876 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0831 15:29:37.354940    2876 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0831 15:29:37.355015    2876 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0831 15:29:37.355046    2876 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0831 15:29:37.462381    2876 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0831 15:29:37.462478    2876 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I0831 15:29:37.972217    2876 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 510.286911ms
	I0831 15:29:37.972306    2876 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0831 15:29:43.988604    2876 kubeadm.go:310] [api-check] The API server is healthy after 6.020603512s
	I0831 15:29:44.000520    2876 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0831 15:29:44.008573    2876 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0831 15:29:44.022134    2876 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0831 15:29:44.022318    2876 kubeadm.go:310] [mark-control-plane] Marking the node ha-949000 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0831 15:29:44.029102    2876 kubeadm.go:310] [bootstrap-token] Using token: zw6kb9.o9r4potygin4i7x2
	I0831 15:29:44.050780    2876 out.go:235]   - Configuring RBAC rules ...
	I0831 15:29:44.050942    2876 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0831 15:29:44.094287    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0831 15:29:44.099052    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0831 15:29:44.101377    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0831 15:29:44.103328    2876 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0831 15:29:44.105426    2876 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0831 15:29:44.395210    2876 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0831 15:29:44.821705    2876 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0831 15:29:45.395130    2876 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0831 15:29:45.396108    2876 kubeadm.go:310] 
	I0831 15:29:45.396158    2876 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0831 15:29:45.396163    2876 kubeadm.go:310] 
	I0831 15:29:45.396236    2876 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0831 15:29:45.396245    2876 kubeadm.go:310] 
	I0831 15:29:45.396264    2876 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0831 15:29:45.396314    2876 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0831 15:29:45.396355    2876 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0831 15:29:45.396359    2876 kubeadm.go:310] 
	I0831 15:29:45.396397    2876 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0831 15:29:45.396406    2876 kubeadm.go:310] 
	I0831 15:29:45.396453    2876 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0831 15:29:45.396458    2876 kubeadm.go:310] 
	I0831 15:29:45.396496    2876 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0831 15:29:45.396560    2876 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0831 15:29:45.396617    2876 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0831 15:29:45.396623    2876 kubeadm.go:310] 
	I0831 15:29:45.396691    2876 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0831 15:29:45.396760    2876 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0831 15:29:45.396766    2876 kubeadm.go:310] 
	I0831 15:29:45.396839    2876 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token zw6kb9.o9r4potygin4i7x2 \
	I0831 15:29:45.396919    2876 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 \
	I0831 15:29:45.396939    2876 kubeadm.go:310] 	--control-plane 
	I0831 15:29:45.396943    2876 kubeadm.go:310] 
	I0831 15:29:45.397018    2876 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0831 15:29:45.397029    2876 kubeadm.go:310] 
	I0831 15:29:45.397093    2876 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token zw6kb9.o9r4potygin4i7x2 \
	I0831 15:29:45.397173    2876 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 
	I0831 15:29:45.397526    2876 kubeadm.go:310] W0831 22:29:34.618825    1608 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 15:29:45.397751    2876 kubeadm.go:310] W0831 22:29:34.619993    1608 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 15:29:45.397847    2876 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0831 15:29:45.397857    2876 cni.go:84] Creating CNI manager for ""
	I0831 15:29:45.397874    2876 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0831 15:29:45.420531    2876 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0831 15:29:45.477445    2876 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0831 15:29:45.482633    2876 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.31.0/kubectl ...
	I0831 15:29:45.482643    2876 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2601 bytes)
	I0831 15:29:45.498168    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0831 15:29:45.749965    2876 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0831 15:29:45.750050    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000 minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=true
	I0831 15:29:45.750061    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:45.882304    2876 ops.go:34] apiserver oom_adj: -16
	I0831 15:29:45.896818    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:46.398021    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:46.897815    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:47.397274    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:47.897049    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:48.397593    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0831 15:29:48.462357    2876 kubeadm.go:1113] duration metric: took 2.712335704s to wait for elevateKubeSystemPrivileges
	I0831 15:29:48.462374    2876 kubeadm.go:394] duration metric: took 13.821875392s to StartCluster
	I0831 15:29:48.462389    2876 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:48.462482    2876 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:48.462909    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:29:48.463157    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0831 15:29:48.463168    2876 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:48.463181    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:29:48.463194    2876 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 15:29:48.463223    2876 addons.go:69] Setting storage-provisioner=true in profile "ha-949000"
	I0831 15:29:48.463228    2876 addons.go:69] Setting default-storageclass=true in profile "ha-949000"
	I0831 15:29:48.463245    2876 addons.go:234] Setting addon storage-provisioner=true in "ha-949000"
	I0831 15:29:48.463250    2876 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-949000"
	I0831 15:29:48.463260    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:29:48.463303    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:48.463512    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.463518    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.463528    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.463540    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.472681    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51052
	I0831 15:29:48.473013    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51054
	I0831 15:29:48.473095    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.473332    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.473451    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.473463    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.473652    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.473665    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.473689    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.473921    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.474101    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.474113    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.474145    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.474214    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.474299    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.476440    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:29:48.476667    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 15:29:48.477025    2876 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 15:29:48.477197    2876 addons.go:234] Setting addon default-storageclass=true in "ha-949000"
	I0831 15:29:48.477218    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:29:48.477428    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.477442    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.483175    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51056
	I0831 15:29:48.483519    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.483886    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.483904    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.484146    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.484254    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.484334    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.484406    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.485343    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:48.485904    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51058
	I0831 15:29:48.486187    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.486486    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.486495    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.486696    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.487040    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:48.487078    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:48.495680    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51060
	I0831 15:29:48.496017    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:48.496360    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:48.496389    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:48.496611    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:48.496715    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:29:48.496791    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:48.496872    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:29:48.497794    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:29:48.497926    2876 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0831 15:29:48.497934    2876 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0831 15:29:48.497944    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:48.498021    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:48.498099    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:48.498200    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:48.498277    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:48.507200    2876 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0831 15:29:48.527696    2876 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0831 15:29:48.527708    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0831 15:29:48.527725    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:29:48.527878    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:29:48.527981    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:29:48.528082    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:29:48.528217    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:29:48.528370    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.169.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0831 15:29:48.564053    2876 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0831 15:29:48.586435    2876 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0831 15:29:48.827708    2876 start.go:971] {"host.minikube.internal": 192.169.0.1} host record injected into CoreDNS's ConfigMap
	I0831 15:29:48.827730    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.827739    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.827907    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.827916    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.827922    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.827926    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.828046    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.828049    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:48.828058    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.828113    2876 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 15:29:48.828125    2876 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 15:29:48.828210    2876 round_trippers.go:463] GET https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0831 15:29:48.828215    2876 round_trippers.go:469] Request Headers:
	I0831 15:29:48.828223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:29:48.828227    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:29:48.833724    2876 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:29:48.834156    2876 round_trippers.go:463] PUT https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0831 15:29:48.834163    2876 round_trippers.go:469] Request Headers:
	I0831 15:29:48.834169    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:29:48.834199    2876 round_trippers.go:473]     Content-Type: application/json
	I0831 15:29:48.834205    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:29:48.835718    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:29:48.835861    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:48.835876    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:48.836028    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:48.836037    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:48.836048    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.019783    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:49.019796    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:49.019979    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:49.019989    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:49.019994    2876 main.go:141] libmachine: Making call to close driver server
	I0831 15:29:49.019999    2876 main.go:141] libmachine: (ha-949000) Calling .Close
	I0831 15:29:49.019999    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.020151    2876 main.go:141] libmachine: Successfully made call to close driver server
	I0831 15:29:49.020153    2876 main.go:141] libmachine: (ha-949000) DBG | Closing plugin on server side
	I0831 15:29:49.020159    2876 main.go:141] libmachine: Making call to close connection to plugin binary
	I0831 15:29:49.059498    2876 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
	I0831 15:29:49.117324    2876 addons.go:510] duration metric: took 654.121351ms for enable addons: enabled=[default-storageclass storage-provisioner]
	I0831 15:29:49.117374    2876 start.go:246] waiting for cluster config update ...
	I0831 15:29:49.117390    2876 start.go:255] writing updated cluster config ...
	I0831 15:29:49.155430    2876 out.go:201] 
	I0831 15:29:49.192527    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:29:49.192625    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:49.214378    2876 out.go:177] * Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	I0831 15:29:49.272137    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:29:49.272171    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:29:49.272338    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:29:49.272356    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:29:49.272445    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:29:49.273113    2876 start.go:360] acquireMachinesLock for ha-949000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:29:49.273204    2876 start.go:364] duration metric: took 68.322µs to acquireMachinesLock for "ha-949000-m02"
	I0831 15:29:49.273234    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:29:49.273329    2876 start.go:125] createHost starting for "m02" (driver="hyperkit")
	I0831 15:29:49.296266    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:29:49.296429    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:29:49.296488    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:29:49.306391    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51065
	I0831 15:29:49.306732    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:29:49.307039    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:29:49.307051    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:29:49.307254    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:29:49.307374    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:29:49.307457    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:29:49.307559    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:29:49.307576    2876 client.go:168] LocalClient.Create starting
	I0831 15:29:49.307604    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:29:49.307643    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:49.307655    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:49.307696    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:29:49.307726    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:29:49.307735    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:29:49.307749    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:29:49.307754    2876 main.go:141] libmachine: (ha-949000-m02) Calling .PreCreateCheck
	I0831 15:29:49.307836    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.307906    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:29:49.333695    2876 main.go:141] libmachine: Creating machine...
	I0831 15:29:49.333716    2876 main.go:141] libmachine: (ha-949000-m02) Calling .Create
	I0831 15:29:49.333916    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.334092    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.333909    2898 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:29:49.334195    2876 main.go:141] libmachine: (ha-949000-m02) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:29:49.534537    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.534440    2898 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa...
	I0831 15:29:49.629999    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.629917    2898 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk...
	I0831 15:29:49.630021    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Writing magic tar header
	I0831 15:29:49.630031    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Writing SSH key tar header
	I0831 15:29:49.630578    2876 main.go:141] libmachine: (ha-949000-m02) DBG | I0831 15:29:49.630526    2898 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02 ...
	I0831 15:29:49.986563    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:49.986593    2876 main.go:141] libmachine: (ha-949000-m02) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid
	I0831 15:29:49.986663    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Using UUID 23e5d675-5201-4f3d-86b7-b25c818528d1
	I0831 15:29:50.021467    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Generated MAC 92:7:3c:3f:ee:b7
	I0831 15:29:50.021484    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:29:50.021548    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:50.021582    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:29:50.021623    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "23e5d675-5201-4f3d-86b7-b25c818528d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:29:50.021665    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 23e5d675-5201-4f3d-86b7-b25c818528d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:29:50.021684    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:29:50.024624    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 DEBUG: hyperkit: Pid is 2899
	I0831 15:29:50.025044    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 0
	I0831 15:29:50.025058    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:50.025119    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:50.026207    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:50.026276    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:50.026305    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:50.026350    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:50.026373    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:50.026416    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:50.032754    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:29:50.041001    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:29:50.041896    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:50.041918    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:50.041929    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:50.041946    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:50.432260    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:29:50.432276    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:29:50.547071    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:29:50.547090    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:29:50.547112    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:29:50.547127    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:29:50.547965    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:29:50.547973    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:50 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:29:52.027270    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 1
	I0831 15:29:52.027288    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:52.027415    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:52.028177    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:52.028225    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:52.028236    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:52.028247    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:52.028254    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:52.028263    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:54.029110    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 2
	I0831 15:29:54.029126    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:54.029231    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:54.029999    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:54.030057    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:54.030075    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:54.030087    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:54.030095    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:54.030103    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:56.031274    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 3
	I0831 15:29:56.031292    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:56.031369    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:56.032155    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:56.032168    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:56.032178    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:56.032196    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:56.032213    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:56.032224    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:29:56.132338    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:29:56.132386    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:29:56.132396    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:29:56.155372    2876 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:29:56 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:29:58.032308    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 4
	I0831 15:29:58.032325    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:29:58.032424    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:29:58.033214    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:29:58.033247    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0831 15:29:58.033259    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:29:58.033269    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:29:58.033278    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:29:58.033287    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:30:00.033449    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 5
	I0831 15:30:00.033465    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:00.033544    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:30:00.034313    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:30:00.034404    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:30:00.034418    2876 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:30:00.034426    2876 main.go:141] libmachine: (ha-949000-m02) DBG | Found match: 92:7:3c:3f:ee:b7
	I0831 15:30:00.034433    2876 main.go:141] libmachine: (ha-949000-m02) DBG | IP: 192.169.0.6
	I0831 15:30:00.034475    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:30:00.035147    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:00.035249    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:00.035348    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:30:00.035357    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:30:00.035434    2876 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:00.035493    2876 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 2899
	I0831 15:30:00.036274    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:30:00.036284    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:30:00.036289    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:30:00.036293    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:00.036398    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:00.036485    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:00.036575    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:00.036655    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:00.036771    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:00.036969    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:00.036976    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:30:01.059248    2876 main.go:141] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I0831 15:30:04.124333    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:30:04.124345    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:30:04.124351    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.124488    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.124590    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.124683    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.124778    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.124921    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.125101    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.125110    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:30:04.190272    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:30:04.190323    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:30:04.190329    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:30:04.190334    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.190465    2876 buildroot.go:166] provisioning hostname "ha-949000-m02"
	I0831 15:30:04.190476    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.190558    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.190652    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.190763    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.190844    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.190943    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.191068    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.191204    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.191213    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m02 && echo "ha-949000-m02" | sudo tee /etc/hostname
	I0831 15:30:04.267934    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m02
	
	I0831 15:30:04.267948    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.268081    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.268202    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.268299    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.268391    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.268525    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.268665    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.268684    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:30:04.340314    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:30:04.340330    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:30:04.340340    2876 buildroot.go:174] setting up certificates
	I0831 15:30:04.340346    2876 provision.go:84] configureAuth start
	I0831 15:30:04.340353    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:30:04.340483    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:04.340577    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.340665    2876 provision.go:143] copyHostCerts
	I0831 15:30:04.340691    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:30:04.340751    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:30:04.340757    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:30:04.340904    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:30:04.341121    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:30:04.341161    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:30:04.341166    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:30:04.341243    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:30:04.341390    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:30:04.341427    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:30:04.341432    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:30:04.341508    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:30:04.341670    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m02 san=[127.0.0.1 192.169.0.6 ha-949000-m02 localhost minikube]
	I0831 15:30:04.509456    2876 provision.go:177] copyRemoteCerts
	I0831 15:30:04.509508    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:30:04.509523    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.509674    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.509762    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.509874    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.509973    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:04.550810    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:30:04.550883    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:30:04.571982    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:30:04.572058    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:30:04.592601    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:30:04.592680    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:30:04.612516    2876 provision.go:87] duration metric: took 272.157929ms to configureAuth
	I0831 15:30:04.612531    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:30:04.612691    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:04.612706    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:04.612851    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.612970    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.613064    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.613150    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.613227    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.613345    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.613483    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.613491    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:30:04.678333    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:30:04.678345    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:30:04.678436    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:30:04.678450    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.678582    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.678669    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.678767    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.678846    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.678978    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.679124    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.679167    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:30:04.756204    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:30:04.756224    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:04.756411    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:04.756527    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.756630    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:04.756734    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:04.756851    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:04.757006    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:04.757027    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:30:06.370825    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:30:06.370840    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:30:06.370855    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetURL
	I0831 15:30:06.370996    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:30:06.371003    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:30:06.371008    2876 client.go:171] duration metric: took 17.063185858s to LocalClient.Create
	I0831 15:30:06.371017    2876 start.go:167] duration metric: took 17.063218984s to libmachine.API.Create "ha-949000"
	I0831 15:30:06.371023    2876 start.go:293] postStartSetup for "ha-949000-m02" (driver="hyperkit")
	I0831 15:30:06.371029    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:30:06.371039    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.371176    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:30:06.371190    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.371279    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.371365    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.371448    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.371522    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:06.410272    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:30:06.413456    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:30:06.413467    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:30:06.413573    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:30:06.413753    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:30:06.413762    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:30:06.413962    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:30:06.421045    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:30:06.440540    2876 start.go:296] duration metric: took 69.508758ms for postStartSetup
	I0831 15:30:06.440562    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:30:06.441179    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:06.441343    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:30:06.441726    2876 start.go:128] duration metric: took 17.168146238s to createHost
	I0831 15:30:06.441741    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.441826    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.441909    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.442008    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.442102    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.442220    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:30:06.442339    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:30:06.442346    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:30:06.507669    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143406.563138986
	
	I0831 15:30:06.507682    2876 fix.go:216] guest clock: 1725143406.563138986
	I0831 15:30:06.507687    2876 fix.go:229] Guest: 2024-08-31 15:30:06.563138986 -0700 PDT Remote: 2024-08-31 15:30:06.441735 -0700 PDT m=+57.202103081 (delta=121.403986ms)
	I0831 15:30:06.507698    2876 fix.go:200] guest clock delta is within tolerance: 121.403986ms
	I0831 15:30:06.507701    2876 start.go:83] releasing machines lock for "ha-949000-m02", held for 17.234244881s
	I0831 15:30:06.507719    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.507845    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:06.534518    2876 out.go:177] * Found network options:
	I0831 15:30:06.585154    2876 out.go:177]   - NO_PROXY=192.169.0.5
	W0831 15:30:06.608372    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:30:06.608434    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609377    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609624    2876 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:30:06.609725    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:30:06.609763    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	W0831 15:30:06.609837    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:30:06.609978    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.609993    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:30:06.610018    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:30:06.610265    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.610300    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:30:06.610460    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.610487    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:30:06.610621    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:30:06.610643    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:30:06.610806    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	W0831 15:30:06.649012    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:30:06.649075    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:30:06.693849    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:30:06.693863    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:30:06.693938    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:30:06.709316    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:30:06.718380    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:30:06.727543    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:30:06.727609    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:30:06.736698    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:30:06.745615    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:30:06.755140    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:30:06.764398    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:30:06.773464    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:30:06.782661    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:30:06.791918    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:30:06.801132    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:30:06.809259    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:30:06.817528    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:06.918051    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:30:06.937658    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:30:06.937726    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:30:06.952225    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:30:06.964364    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:30:06.981641    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:30:06.992676    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:30:07.003746    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:30:07.061399    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:30:07.071765    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:30:07.086915    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:30:07.089960    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:30:07.097339    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:30:07.110902    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:30:07.218878    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:30:07.327438    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:30:07.327478    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:30:07.343077    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:07.455166    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:30:09.753051    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.297833346s)
	I0831 15:30:09.753112    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:30:09.763410    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:30:09.776197    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:30:09.788015    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:30:09.886287    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:30:09.979666    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:10.091986    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:30:10.105474    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:30:10.116526    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:10.223654    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:30:10.284365    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:30:10.284447    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:30:10.288841    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:30:10.288894    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:30:10.292674    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:30:10.327492    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:30:10.327571    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:30:10.348428    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:30:10.394804    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:30:10.438643    2876 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:30:10.460438    2876 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:30:10.460677    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:30:10.463911    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:30:10.474227    2876 mustload.go:65] Loading cluster: ha-949000
	I0831 15:30:10.474382    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:10.474620    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:10.474636    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:10.483465    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51091
	I0831 15:30:10.483852    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:10.484170    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:10.484182    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:10.484380    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:10.484504    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:30:10.484591    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:30:10.484661    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:30:10.485631    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:30:10.485888    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:10.485912    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:10.494468    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51093
	I0831 15:30:10.494924    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:10.495238    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:10.495250    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:10.495476    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:10.495585    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:30:10.495693    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.6
	I0831 15:30:10.495700    2876 certs.go:194] generating shared ca certs ...
	I0831 15:30:10.495711    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.495883    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:30:10.495953    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:30:10.495961    2876 certs.go:256] generating profile certs ...
	I0831 15:30:10.496069    2876 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:30:10.496092    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952
	I0831 15:30:10.496104    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0831 15:30:10.585710    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 ...
	I0831 15:30:10.585732    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952: {Name:mkfd98043f041b827744dcc9a0bc27d9f7ba3a8d Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.586080    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952 ...
	I0831 15:30:10.586093    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952: {Name:mk6025bd0561394827636d384e273ec532f21510 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:30:10.586307    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.2cd83952 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:30:10.586527    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:30:10.586791    2876 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:30:10.586800    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:30:10.586823    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:30:10.586842    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:30:10.586860    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:30:10.586879    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:30:10.586902    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:30:10.586921    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:30:10.586939    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:30:10.587027    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:30:10.587073    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:30:10.587082    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:30:10.587115    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:30:10.587145    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:30:10.587174    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:30:10.587237    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:30:10.587271    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:10.587293    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:30:10.587312    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:30:10.587343    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:30:10.587493    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:30:10.587598    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:30:10.587689    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:30:10.587790    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:30:10.619319    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:30:10.622586    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:30:10.631798    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:30:10.634863    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:30:10.644806    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:30:10.648392    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:30:10.657224    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:30:10.660506    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:30:10.668998    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:30:10.672282    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:30:10.681734    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:30:10.685037    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:30:10.697579    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:30:10.717100    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:30:10.736755    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:30:10.757074    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:30:10.776635    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0831 15:30:10.796052    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:30:10.815309    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:30:10.834549    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:30:10.854663    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:30:10.873734    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:30:10.892872    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:30:10.912223    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:30:10.925669    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:30:10.939310    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:30:10.952723    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:30:10.966203    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:30:10.980670    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:30:10.994195    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:30:11.007818    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:30:11.012076    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:30:11.021306    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.024674    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.024710    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:30:11.028962    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:30:11.038172    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:30:11.048226    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.051704    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.051746    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:30:11.056026    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:30:11.065281    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:30:11.074586    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.077977    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.078018    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:30:11.082263    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:30:11.091560    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:30:11.094606    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:30:11.094641    2876 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.31.0 docker true true} ...
	I0831 15:30:11.094696    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:30:11.094712    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:30:11.094743    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:30:11.107306    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:30:11.107348    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:30:11.107400    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:30:11.116476    2876 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:30:11.116538    2876 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:30:11.125199    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet
	I0831 15:30:11.125199    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl
	I0831 15:30:11.125202    2876 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm
	I0831 15:30:13.495982    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:30:13.496079    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:30:13.499639    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:30:13.499660    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:30:14.245316    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:30:14.245403    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:30:14.249019    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:30:14.249045    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:30:14.305452    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:30:14.335903    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:30:14.336035    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:30:14.348689    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:30:14.348746    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
	I0831 15:30:14.608960    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:30:14.617331    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:30:14.630716    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:30:14.643952    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:30:14.657665    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:30:14.660616    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:30:14.670825    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:14.766762    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:30:14.782036    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:30:14.782341    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:30:14.782363    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:30:14.791218    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51120
	I0831 15:30:14.791554    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:30:14.791943    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:30:14.791962    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:30:14.792169    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:30:14.792281    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:30:14.792379    2876 start.go:317] joinCluster: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 Clu
sterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpira
tion:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:30:14.792482    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0831 15:30:14.792500    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:30:14.792589    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:30:14.792677    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:30:14.792804    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:30:14.792889    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:30:14.904364    2876 start.go:343] trying to join control-plane node "m02" to cluster: &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:30:14.904404    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token sa5gl8.nk4lqkhvqrn6uouk --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443"
	I0831 15:30:43.067719    2876 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token sa5gl8.nk4lqkhvqrn6uouk --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443": (28.162893612s)
	I0831 15:30:43.067762    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0831 15:30:43.495593    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000-m02 minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=false
	I0831 15:30:43.584878    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-949000-m02 node-role.kubernetes.io/control-plane:NoSchedule-
	I0831 15:30:43.672222    2876 start.go:319] duration metric: took 28.879433845s to joinCluster
	I0831 15:30:43.672264    2876 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:30:43.672464    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:30:43.696001    2876 out.go:177] * Verifying Kubernetes components...
	I0831 15:30:43.753664    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:30:43.969793    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:30:43.995704    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:30:43.995955    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, U
serAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:30:43.995999    2876 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:30:43.996168    2876 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:30:43.996224    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:43.996229    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:43.996246    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:43.996253    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:44.008886    2876 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0831 15:30:44.496443    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:44.496458    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:44.496465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:44.496468    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:44.499732    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:44.996970    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:44.996984    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:44.996990    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:44.996993    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.000189    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:45.496917    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:45.496930    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:45.496936    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:45.496939    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.498866    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:30:45.996558    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:45.996579    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:45.996604    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:45.996626    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:45.999357    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:45.999667    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:46.496895    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:46.496907    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:46.496914    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:46.496917    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:46.499220    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:46.996382    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:46.996397    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:46.996403    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:46.996406    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:46.998788    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:47.497035    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:47.497048    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:47.497055    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:47.497059    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:47.499487    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:47.996662    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:47.996675    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:47.996695    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:47.996699    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:47.998935    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:48.496588    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:48.496603    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:48.496610    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:48.496613    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:48.498806    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:48.499160    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:48.996774    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:48.996800    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:48.996806    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:48.996810    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:48.998862    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:49.496728    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:49.496741    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:49.496748    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:49.496753    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:49.500270    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:49.996536    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:49.996548    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:49.996555    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:49.996560    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:49.998977    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:50.496423    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:50.496441    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:50.496452    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:50.496458    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:50.499488    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:50.499941    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:50.996502    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:50.996515    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:50.996520    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:50.996525    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:50.998339    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:30:51.496978    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:51.496999    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:51.497011    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:51.497018    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:51.499859    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:51.997186    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:51.997200    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:51.997207    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:51.997210    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.000228    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:52.498065    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:52.498084    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:52.498093    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:52.498097    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.500425    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:52.500868    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:52.996733    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:52.996786    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:52.996804    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:52.996819    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:52.999878    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:53.496732    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:53.496752    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:53.496764    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:53.496772    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:53.499723    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:53.996635    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:53.996698    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:53.996722    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:53.996730    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.000327    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:54.496855    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:54.496875    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:54.496883    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:54.496888    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.499247    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:54.996676    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:54.996692    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:54.996701    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:54.996706    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:54.999066    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:54.999477    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:55.496949    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:55.496960    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:55.496967    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:55.496971    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:55.499074    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:55.996611    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:55.996627    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:55.996644    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:55.996651    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:55.999061    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:56.497363    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:56.497376    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:56.497383    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:56.497386    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:56.499540    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:56.997791    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:56.997810    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:56.997822    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:56.997828    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.001116    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:57.001481    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:57.497843    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:57.497862    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:57.497874    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:57.497881    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.500770    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:57.998298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:57.998324    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:57.998335    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:57.998344    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.002037    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:58.496643    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:58.496664    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:58.496677    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:58.496683    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.499466    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:30:58.997398    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:58.997468    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:58.997484    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:58.997490    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.000768    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:59.498644    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:59.498668    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:59.498680    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:59.498685    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.502573    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:30:59.503046    2876 node_ready.go:53] node "ha-949000-m02" has status "Ready":"False"
	I0831 15:30:59.996689    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:30:59.996715    2876 round_trippers.go:469] Request Headers:
	I0831 15:30:59.996765    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:30:59.996773    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:30:59.999409    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:00.496654    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:00.496668    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.496677    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.496681    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.498585    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.499019    2876 node_ready.go:49] node "ha-949000-m02" has status "Ready":"True"
	I0831 15:31:00.499031    2876 node_ready.go:38] duration metric: took 16.50261118s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:31:00.499038    2876 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:31:00.499081    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:00.499087    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.499092    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.499095    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.502205    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.506845    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.506892    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:31:00.506897    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.506903    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.506908    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.508659    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.509078    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.509085    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.509091    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.509094    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.510447    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.510831    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.510839    2876 pod_ready.go:82] duration metric: took 3.983743ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.510852    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.510887    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:31:00.510892    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.510897    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.510901    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.512274    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.512740    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.512747    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.512752    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.512757    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.514085    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.514446    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.514457    2876 pod_ready.go:82] duration metric: took 3.596287ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.514464    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.514501    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:31:00.514506    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.514512    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.514515    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.517897    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.518307    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.518314    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.518320    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.518324    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.519756    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.520128    2876 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.520138    2876 pod_ready.go:82] duration metric: took 5.668748ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.520144    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.520177    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:31:00.520182    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.520187    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.520191    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.521454    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.521852    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:00.521860    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.521865    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.521870    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.523054    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:00.523372    2876 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.523381    2876 pod_ready.go:82] duration metric: took 3.231682ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.523393    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.698293    2876 request.go:632] Waited for 174.813181ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:31:00.698344    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:31:00.698420    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.698432    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.698439    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.701539    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.897673    2876 request.go:632] Waited for 195.424003ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.897783    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:00.897794    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:00.897805    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:00.897814    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:00.900981    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:00.901407    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:00.901419    2876 pod_ready.go:82] duration metric: took 378.015429ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:00.901429    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.097805    2876 request.go:632] Waited for 196.320526ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:31:01.097926    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:31:01.097936    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.097947    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.097955    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.100563    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.298122    2876 request.go:632] Waited for 197.162644ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:01.298157    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:01.298162    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.298168    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.298172    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.300402    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.300781    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:01.300791    2876 pod_ready.go:82] duration metric: took 399.34942ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.300807    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.497316    2876 request.go:632] Waited for 196.39746ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:31:01.497376    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:31:01.497387    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.497397    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.497405    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.500651    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:01.698231    2876 request.go:632] Waited for 196.759957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:01.698322    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:01.698333    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.698344    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.698353    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.701256    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:01.701766    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:01.701775    2876 pod_ready.go:82] duration metric: took 400.954779ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.701785    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:01.898783    2876 request.go:632] Waited for 196.946643ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:31:01.898903    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:31:01.898917    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:01.898929    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:01.898938    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:01.902347    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.097749    2876 request.go:632] Waited for 194.738931ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.097815    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.097824    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.097834    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.097843    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.101525    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.102016    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.102028    2876 pod_ready.go:82] duration metric: took 400.230387ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.102037    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.296929    2876 request.go:632] Waited for 194.771963ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:31:02.296979    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:31:02.296996    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.297010    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.297016    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.300518    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.498356    2876 request.go:632] Waited for 197.140595ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.498409    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:02.498414    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.498421    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.498425    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.500151    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:02.500554    2876 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.500564    2876 pod_ready.go:82] duration metric: took 398.515508ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.500577    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.697756    2876 request.go:632] Waited for 197.121926ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:31:02.697847    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:31:02.697859    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.697871    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.697879    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.701227    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:02.896975    2876 request.go:632] Waited for 195.16614ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:02.897029    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:02.897044    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:02.897050    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:02.897054    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:02.899135    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:02.899494    2876 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:02.899504    2876 pod_ready.go:82] duration metric: took 398.915896ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:02.899511    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.098441    2876 request.go:632] Waited for 198.871316ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:31:03.098576    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:31:03.098587    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.098599    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.098606    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.101995    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.297740    2876 request.go:632] Waited for 194.927579ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:03.297801    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:31:03.297842    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.297855    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.297863    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.300956    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.301560    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:03.301572    2876 pod_ready.go:82] duration metric: took 402.049602ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.301580    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.498380    2876 request.go:632] Waited for 196.707011ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:31:03.498472    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:31:03.498482    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.498494    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.498505    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.502174    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.696864    2876 request.go:632] Waited for 194.200989ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:03.696916    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:31:03.696926    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.696938    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.696944    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.700327    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:03.700769    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:31:03.700782    2876 pod_ready.go:82] duration metric: took 399.189338ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:31:03.700791    2876 pod_ready.go:39] duration metric: took 3.201699285s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:31:03.700816    2876 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:31:03.700877    2876 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:31:03.712528    2876 api_server.go:72] duration metric: took 20.039964419s to wait for apiserver process to appear ...
	I0831 15:31:03.712539    2876 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:31:03.712554    2876 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:31:03.715722    2876 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:31:03.715760    2876 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:31:03.715765    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.715771    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.715775    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.716371    2876 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:31:03.716424    2876 api_server.go:141] control plane version: v1.31.0
	I0831 15:31:03.716433    2876 api_server.go:131] duration metric: took 3.890107ms to wait for apiserver health ...
	I0831 15:31:03.716440    2876 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:31:03.898331    2876 request.go:632] Waited for 181.827666ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:03.898385    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:03.898446    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:03.898465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:03.898473    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:03.903436    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:31:03.906746    2876 system_pods.go:59] 17 kube-system pods found
	I0831 15:31:03.906767    2876 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:31:03.906771    2876 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:31:03.906775    2876 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:31:03.906778    2876 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:31:03.906783    2876 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:31:03.906786    2876 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:31:03.906789    2876 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:31:03.906793    2876 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:31:03.906796    2876 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:31:03.906799    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:31:03.906802    2876 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:31:03.906805    2876 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:31:03.906810    2876 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:31:03.906814    2876 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:31:03.906816    2876 system_pods.go:61] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:31:03.906819    2876 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:31:03.906824    2876 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:31:03.906830    2876 system_pods.go:74] duration metric: took 190.381994ms to wait for pod list to return data ...
	I0831 15:31:03.906835    2876 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:31:04.096833    2876 request.go:632] Waited for 189.933385ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:31:04.096919    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:31:04.096929    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.096940    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.096947    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.100750    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:04.100942    2876 default_sa.go:45] found service account: "default"
	I0831 15:31:04.100955    2876 default_sa.go:55] duration metric: took 194.103228ms for default service account to be created ...
	I0831 15:31:04.100963    2876 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:31:04.297283    2876 request.go:632] Waited for 196.269925ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:04.297349    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:31:04.297359    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.297370    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.297380    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.301594    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:31:04.305403    2876 system_pods.go:86] 17 kube-system pods found
	I0831 15:31:04.305414    2876 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:31:04.305418    2876 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:31:04.305421    2876 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:31:04.305424    2876 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:31:04.305427    2876 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:31:04.305431    2876 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:31:04.305434    2876 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:31:04.305438    2876 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:31:04.305440    2876 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:31:04.305443    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:31:04.305446    2876 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:31:04.305449    2876 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:31:04.305452    2876 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:31:04.305455    2876 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:31:04.305457    2876 system_pods.go:89] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:31:04.305459    2876 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:31:04.305462    2876 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:31:04.305467    2876 system_pods.go:126] duration metric: took 204.496865ms to wait for k8s-apps to be running ...
	I0831 15:31:04.305472    2876 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:31:04.305532    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:31:04.316332    2876 system_svc.go:56] duration metric: took 10.855844ms WaitForService to wait for kubelet
	I0831 15:31:04.316347    2876 kubeadm.go:582] duration metric: took 20.643776408s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:31:04.316359    2876 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:31:04.497360    2876 request.go:632] Waited for 180.939277ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:31:04.497396    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:31:04.497400    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:04.497406    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:04.497409    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:04.500112    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:04.500615    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:31:04.500630    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:31:04.500640    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:31:04.500644    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:31:04.500647    2876 node_conditions.go:105] duration metric: took 184.28246ms to run NodePressure ...
	I0831 15:31:04.500655    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:31:04.500673    2876 start.go:255] writing updated cluster config ...
	I0831 15:31:04.522012    2876 out.go:201] 
	I0831 15:31:04.543188    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:04.543261    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:04.565062    2876 out.go:177] * Starting "ha-949000-m03" control-plane node in "ha-949000" cluster
	I0831 15:31:04.608029    2876 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:31:04.608097    2876 cache.go:56] Caching tarball of preloaded images
	I0831 15:31:04.608326    2876 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:31:04.608349    2876 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:31:04.608480    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:04.609474    2876 start.go:360] acquireMachinesLock for ha-949000-m03: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:31:04.609608    2876 start.go:364] duration metric: took 107.158µs to acquireMachinesLock for "ha-949000-m03"
	I0831 15:31:04.609644    2876 start.go:93] Provisioning new machine with config: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m03 IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:04.609770    2876 start.go:125] createHost starting for "m03" (driver="hyperkit")
	I0831 15:31:04.631012    2876 out.go:235] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0831 15:31:04.631142    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:04.631178    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:04.640831    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51128
	I0831 15:31:04.641212    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:04.641538    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:04.641551    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:04.641754    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:04.641864    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:04.641951    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:04.642054    2876 start.go:159] libmachine.API.Create for "ha-949000" (driver="hyperkit")
	I0831 15:31:04.642071    2876 client.go:168] LocalClient.Create starting
	I0831 15:31:04.642111    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem
	I0831 15:31:04.642169    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:31:04.642179    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:31:04.642217    2876 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem
	I0831 15:31:04.642255    2876 main.go:141] libmachine: Decoding PEM data...
	I0831 15:31:04.642264    2876 main.go:141] libmachine: Parsing certificate...
	I0831 15:31:04.642276    2876 main.go:141] libmachine: Running pre-create checks...
	I0831 15:31:04.642281    2876 main.go:141] libmachine: (ha-949000-m03) Calling .PreCreateCheck
	I0831 15:31:04.642379    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:04.642422    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:04.652222    2876 main.go:141] libmachine: Creating machine...
	I0831 15:31:04.652235    2876 main.go:141] libmachine: (ha-949000-m03) Calling .Create
	I0831 15:31:04.652380    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:04.652531    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:04.652372    3223 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:31:04.652595    2876 main.go:141] libmachine: (ha-949000-m03) Downloading /Users/jenkins/minikube-integration/18943-957/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso...
	I0831 15:31:04.967913    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:04.967796    3223 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa...
	I0831 15:31:05.218214    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:05.218148    3223 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk...
	I0831 15:31:05.218234    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Writing magic tar header
	I0831 15:31:05.218243    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Writing SSH key tar header
	I0831 15:31:05.219245    2876 main.go:141] libmachine: (ha-949000-m03) DBG | I0831 15:31:05.219093    3223 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03 ...
	I0831 15:31:05.777334    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:05.777394    2876 main.go:141] libmachine: (ha-949000-m03) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid
	I0831 15:31:05.777478    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Using UUID 3fdefe95-7552-4d5b-8412-6ae6e5c787bb
	I0831 15:31:05.805053    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Generated MAC fa:59:9e:3b:35:6d
	I0831 15:31:05.805071    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:31:05.805106    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00011a5d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:31:05.805131    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00011a5d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:31:05.805226    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "3fdefe95-7552-4d5b-8412-6ae6e5c787bb", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:31:05.805279    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 3fdefe95-7552-4d5b-8412-6ae6e5c787bb -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:31:05.805308    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:31:05.808244    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 DEBUG: hyperkit: Pid is 3227
	I0831 15:31:05.808817    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 0
	I0831 15:31:05.808830    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:05.808902    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:05.809826    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:05.809929    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:05.809949    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:05.809975    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:05.809992    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:05.810004    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:05.810013    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:05.816053    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:31:05.824689    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:31:05.825475    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:31:05.825495    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:31:05.825508    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:31:05.825518    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:31:06.214670    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:31:06.214691    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:31:06.330054    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:31:06.330074    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:31:06.330102    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:31:06.330119    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:31:06.330929    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:31:06.330943    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:06 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:31:07.810124    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 1
	I0831 15:31:07.810138    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:07.810246    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:07.811007    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:07.811057    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:07.811067    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:07.811076    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:07.811082    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:07.811088    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:07.811097    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:09.811187    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 2
	I0831 15:31:09.811200    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:09.811312    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:09.812186    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:09.812196    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:09.812205    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:09.812213    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:09.812234    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:09.812241    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:09.812249    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:11.813365    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 3
	I0831 15:31:11.813388    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:11.813446    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:11.814261    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:11.814310    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:11.814328    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:11.814337    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:11.814342    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:11.814361    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:11.814371    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:11.957428    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:31:11.957483    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:31:11.957496    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:31:11.981309    2876 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:31:11 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:31:13.815231    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 4
	I0831 15:31:13.815245    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:13.815334    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:13.816118    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:13.816176    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0831 15:31:13.816186    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4eae7}
	I0831 15:31:13.816194    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:31:13.816200    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:82:25:5f:bd:90:2f ID:1,82:25:5f:bd:90:2f Lease:0x66d4e96f}
	I0831 15:31:13.816208    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:f6:81:99:76:62:8e ID:1,f6:81:99:76:62:8e Lease:0x66d3974c}
	I0831 15:31:13.816220    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3a:82:3b:14:54:13 ID:1,3a:82:3b:14:54:13 Lease:0x66d4e532}
	I0831 15:31:15.816252    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 5
	I0831 15:31:15.816273    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:15.816393    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:15.817241    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:31:15.817305    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0831 15:31:15.817315    2876 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:31:15.817332    2876 main.go:141] libmachine: (ha-949000-m03) DBG | Found match: fa:59:9e:3b:35:6d
	I0831 15:31:15.817339    2876 main.go:141] libmachine: (ha-949000-m03) DBG | IP: 192.169.0.7
	I0831 15:31:15.817379    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:15.817997    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:15.818096    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:15.818188    2876 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0831 15:31:15.818195    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:31:15.818279    2876 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:15.818331    2876 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:31:15.819115    2876 main.go:141] libmachine: Detecting operating system of created instance...
	I0831 15:31:15.819122    2876 main.go:141] libmachine: Waiting for SSH to be available...
	I0831 15:31:15.819126    2876 main.go:141] libmachine: Getting to WaitForSSH function...
	I0831 15:31:15.819130    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:15.819211    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:15.819288    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:15.819367    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:15.819433    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:15.819544    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:15.819737    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:15.819744    2876 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0831 15:31:16.864414    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:31:16.864428    2876 main.go:141] libmachine: Detecting the provisioner...
	I0831 15:31:16.864434    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.864597    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.864686    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.864782    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.864877    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.865009    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.865163    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.865170    2876 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0831 15:31:16.911810    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0831 15:31:16.911850    2876 main.go:141] libmachine: found compatible host: buildroot
	I0831 15:31:16.911857    2876 main.go:141] libmachine: Provisioning with buildroot...
	I0831 15:31:16.911862    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:16.911989    2876 buildroot.go:166] provisioning hostname "ha-949000-m03"
	I0831 15:31:16.911998    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:16.912088    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.912161    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.912247    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.912323    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.912399    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.912532    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.912676    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.912685    2876 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m03 && echo "ha-949000-m03" | sudo tee /etc/hostname
	I0831 15:31:16.972401    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m03
	
	I0831 15:31:16.972418    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:16.972554    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:16.972683    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.972793    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:16.972889    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:16.973016    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:16.973150    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:16.973161    2876 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:31:17.026608    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:31:17.026626    2876 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:31:17.026635    2876 buildroot.go:174] setting up certificates
	I0831 15:31:17.026641    2876 provision.go:84] configureAuth start
	I0831 15:31:17.026647    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:31:17.026793    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:17.026903    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.026995    2876 provision.go:143] copyHostCerts
	I0831 15:31:17.027029    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:31:17.027088    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:31:17.027094    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:31:17.027236    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:31:17.027433    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:31:17.027471    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:31:17.027477    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:31:17.027559    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:31:17.027700    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:31:17.027737    2876 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:31:17.027742    2876 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:31:17.027813    2876 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:31:17.027956    2876 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m03 san=[127.0.0.1 192.169.0.7 ha-949000-m03 localhost minikube]
	I0831 15:31:17.258292    2876 provision.go:177] copyRemoteCerts
	I0831 15:31:17.258340    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:31:17.258353    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.258490    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.258583    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.258663    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.258746    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:17.289869    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:31:17.289967    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:31:17.308984    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:31:17.309048    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:31:17.328947    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:31:17.329010    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:31:17.348578    2876 provision.go:87] duration metric: took 321.944434ms to configureAuth
	I0831 15:31:17.348592    2876 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:31:17.348776    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:17.348791    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:17.348926    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.349023    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.349112    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.349190    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.349267    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.349365    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.349505    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.349513    2876 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:31:17.396974    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:31:17.396988    2876 buildroot.go:70] root file system type: tmpfs
	I0831 15:31:17.397075    2876 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:31:17.397087    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.397218    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.397314    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.397402    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.397507    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.397637    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.397789    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.397838    2876 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:31:17.455821    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:31:17.455842    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:17.455977    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:17.456072    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.456168    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:17.456252    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:17.456374    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:17.456520    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:17.456533    2876 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:31:19.032300    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:31:19.032316    2876 main.go:141] libmachine: Checking connection to Docker...
	I0831 15:31:19.032323    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetURL
	I0831 15:31:19.032456    2876 main.go:141] libmachine: Docker is up and running!
	I0831 15:31:19.032464    2876 main.go:141] libmachine: Reticulating splines...
	I0831 15:31:19.032468    2876 client.go:171] duration metric: took 14.391172658s to LocalClient.Create
	I0831 15:31:19.032480    2876 start.go:167] duration metric: took 14.391215349s to libmachine.API.Create "ha-949000"
	I0831 15:31:19.032489    2876 start.go:293] postStartSetup for "ha-949000-m03" (driver="hyperkit")
	I0831 15:31:19.032496    2876 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:31:19.032506    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.032660    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:31:19.032675    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.032767    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.032855    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.032947    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.033033    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:19.073938    2876 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:31:19.079886    2876 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:31:19.079901    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:31:19.080017    2876 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:31:19.080199    2876 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:31:19.080206    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:31:19.080413    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:31:19.092434    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:31:19.119963    2876 start.go:296] duration metric: took 87.46929ms for postStartSetup
	I0831 15:31:19.119990    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:31:19.120591    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:19.120767    2876 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:31:19.121161    2876 start.go:128] duration metric: took 14.512164484s to createHost
	I0831 15:31:19.121177    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.121269    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.121343    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.121419    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.121507    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.121631    2876 main.go:141] libmachine: Using SSH client type: native
	I0831 15:31:19.121747    2876 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x320bea0] 0x320ec00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:31:19.121754    2876 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:31:19.168319    2876 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143479.023948613
	
	I0831 15:31:19.168331    2876 fix.go:216] guest clock: 1725143479.023948613
	I0831 15:31:19.168337    2876 fix.go:229] Guest: 2024-08-31 15:31:19.023948613 -0700 PDT Remote: 2024-08-31 15:31:19.12117 -0700 PDT m=+129.881500927 (delta=-97.221387ms)
	I0831 15:31:19.168349    2876 fix.go:200] guest clock delta is within tolerance: -97.221387ms
	I0831 15:31:19.168354    2876 start.go:83] releasing machines lock for "ha-949000-m03", held for 14.559521208s
	I0831 15:31:19.168370    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.168508    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:19.193570    2876 out.go:177] * Found network options:
	I0831 15:31:19.255565    2876 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0831 15:31:19.295062    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:31:19.295088    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:31:19.295104    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.295822    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.296008    2876 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:31:19.296101    2876 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:31:19.296130    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	W0831 15:31:19.296153    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:31:19.296165    2876 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:31:19.296225    2876 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:31:19.296229    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.296236    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:31:19.296334    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.296350    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:31:19.296442    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.296455    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:31:19.296560    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:31:19.296581    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:31:19.296680    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	W0831 15:31:19.323572    2876 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:31:19.323629    2876 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:31:19.371272    2876 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:31:19.371294    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:31:19.371393    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:31:19.387591    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:31:19.396789    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:31:19.405160    2876 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:31:19.405208    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:31:19.413496    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:31:19.422096    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:31:19.430386    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:31:19.438699    2876 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:31:19.447187    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:31:19.455984    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:31:19.464947    2876 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:31:19.474438    2876 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:31:19.482528    2876 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:31:19.490487    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:19.582349    2876 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:31:19.599985    2876 start.go:495] detecting cgroup driver to use...
	I0831 15:31:19.600056    2876 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:31:19.612555    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:31:19.632269    2876 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:31:19.650343    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:31:19.661102    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:31:19.671812    2876 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:31:19.695791    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:31:19.706786    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:31:19.722246    2876 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:31:19.725125    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:31:19.732176    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:31:19.745845    2876 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:31:19.848832    2876 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:31:19.960260    2876 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:31:19.960281    2876 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:31:19.974005    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:20.073538    2876 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:31:22.469978    2876 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.396488217s)
	I0831 15:31:22.470044    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:31:22.482132    2876 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:31:22.494892    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:31:22.505113    2876 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:31:22.597737    2876 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:31:22.715451    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:22.823995    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:31:22.837904    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:31:22.849106    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:22.943937    2876 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:31:23.002374    2876 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:31:23.002452    2876 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:31:23.006859    2876 start.go:563] Will wait 60s for crictl version
	I0831 15:31:23.006916    2876 ssh_runner.go:195] Run: which crictl
	I0831 15:31:23.010129    2876 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:31:23.037227    2876 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:31:23.037307    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:31:23.056021    2876 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:31:23.095679    2876 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:31:23.119303    2876 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:31:23.162269    2876 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:31:23.183203    2876 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:31:23.183553    2876 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:31:23.187788    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:31:23.197219    2876 mustload.go:65] Loading cluster: ha-949000
	I0831 15:31:23.197405    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:23.197647    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:23.197669    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:23.206705    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51151
	I0831 15:31:23.207061    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:23.207432    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:23.207448    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:23.207666    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:23.207786    2876 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:31:23.207874    2876 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:31:23.207946    2876 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:31:23.208928    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:31:23.209186    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:23.209220    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:23.218074    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51153
	I0831 15:31:23.218433    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:23.218804    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:23.218819    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:23.219039    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:23.219165    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:31:23.219284    2876 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.7
	I0831 15:31:23.219289    2876 certs.go:194] generating shared ca certs ...
	I0831 15:31:23.219301    2876 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.219493    2876 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:31:23.219569    2876 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:31:23.219578    2876 certs.go:256] generating profile certs ...
	I0831 15:31:23.219685    2876 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:31:23.219705    2876 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3
	I0831 15:31:23.219719    2876 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0831 15:31:23.437317    2876 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 ...
	I0831 15:31:23.437340    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3: {Name:mk58aa028a0f003ebc9e4d90dc317cdac139f88f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.437643    2876 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3 ...
	I0831 15:31:23.437656    2876 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3: {Name:mkaffb8ad3060932ca991ed93b1f8350d31a48ee Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:31:23.437859    2876 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.0c0868f3 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:31:23.438064    2876 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:31:23.438321    2876 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:31:23.438330    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:31:23.438352    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:31:23.438370    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:31:23.438423    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:31:23.438445    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:31:23.438467    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:31:23.438484    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:31:23.438502    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:31:23.438598    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:31:23.438648    2876 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:31:23.438657    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:31:23.438698    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:31:23.438737    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:31:23.438775    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:31:23.438861    2876 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:31:23.438902    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.438923    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.438941    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.438970    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:31:23.439126    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:31:23.439259    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:31:23.439370    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:31:23.439494    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:31:23.472129    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:31:23.475604    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:31:23.483468    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:31:23.486771    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:31:23.494732    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:31:23.497856    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:31:23.505900    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:31:23.509221    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:31:23.517853    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:31:23.521110    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:31:23.529522    2876 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:31:23.532921    2876 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:31:23.540561    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:31:23.560999    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:31:23.580941    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:31:23.601890    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:31:23.621742    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1444 bytes)
	I0831 15:31:23.642294    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:31:23.662119    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:31:23.682734    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:31:23.702621    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:31:23.722704    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:31:23.743032    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:31:23.763003    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:31:23.776540    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:31:23.790112    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:31:23.803743    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:31:23.817470    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:31:23.831871    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:31:23.845310    2876 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:31:23.858947    2876 ssh_runner.go:195] Run: openssl version
	I0831 15:31:23.863254    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:31:23.871668    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.875114    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.875147    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:31:23.879499    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:31:23.888263    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:31:23.896800    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.900783    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.900840    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:31:23.905239    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:31:23.913677    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:31:23.921998    2876 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.925382    2876 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.925421    2876 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:31:23.929547    2876 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
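	The three `openssl x509 -hash` / `ln -fs` pairs above follow OpenSSL's hashed-directory convention: each CA file is exposed in /etc/ssl/certs under its subject hash plus a ".0" suffix so OpenSSL can locate it by hash at verify time. A minimal sketch of that idiom (the helper name is made up for illustration; the "b5213941" hash is the one the log computed for minikubeCA.pem):

	```python
	import os
	import tempfile

	def link_cert_by_hash(cert_dir, cert_path, subject_hash):
	    """Mirror the log's `test -L <hash>.0 || ln -fs ...` step: symlink a
	    CA file under its OpenSSL subject-hash name, skipping if present."""
	    link = os.path.join(cert_dir, subject_hash + ".0")
	    if not os.path.islink(link):       # test -L
	        os.symlink(cert_path, link)    # ln -fs
	    return link

	d = tempfile.mkdtemp()
	ca = os.path.join(d, "minikubeCA.pem")
	open(ca, "w").write("dummy cert\n")
	# hash value taken from `openssl x509 -hash -noout` in the log above
	link = link_cert_by_hash(d, ca, "b5213941")
	print(os.path.basename(link))  # b5213941.0
	```

	The `test -L || ln -fs` guard makes the step idempotent, which matters because these cert-setup commands rerun on every `minikube start`.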
	I0831 15:31:23.938211    2876 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:31:23.941244    2876 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:31:23.941280    2876 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.31.0 docker true true} ...
	I0831 15:31:23.941346    2876 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:31:23.941365    2876 kube-vip.go:115] generating kube-vip config ...
	I0831 15:31:23.941403    2876 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:31:23.953552    2876 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:31:23.953594    2876 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
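	The generated kube-vip static pod above runs leader election (`vip_leaderelection`) over the `plndr-cp-lock` lease so exactly one control-plane node holds the 192.169.0.254 VIP, with `lb_enable` spreading API traffic across members. Its timing knobs follow the usual client-go-style ordering constraint (lease duration > renew deadline > retry period); a tiny sanity check of the manifest's values, assuming that constraint applies:

	```python
	def valid_leader_election(lease, renew, retry):
	    """Leader-election timings must nest: the lease outlives the renew
	    deadline, which outlives a single retry period."""
	    return lease > renew > retry > 0

	# values from the kube-vip manifest above
	print(valid_leader_election(5, 3, 1))  # True
	```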
	I0831 15:31:23.953640    2876 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:31:23.961797    2876 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:31:23.961850    2876 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:31:23.970244    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256
	I0831 15:31:23.970245    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256
	I0831 15:31:23.970248    2876 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256
	I0831 15:31:23.970260    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:31:23.970262    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:31:23.970297    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:31:23.970351    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:31:23.970358    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:31:23.982898    2876 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:31:23.982926    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:31:23.982950    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:31:23.982949    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:31:23.982968    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:31:23.983039    2876 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:31:24.006648    2876 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:31:24.006684    2876 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
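	Each binary transfer above uses the same pattern: `stat` the destination first, and only scp the cached binary when the stat exits non-zero (file missing). A hedged local sketch of that check-then-copy flow, with `shutil.copy2` standing in for the scp step and the helper name invented for illustration:

	```python
	import os
	import shutil

	def ensure_binary(cache_path, dest_path):
	    """Copy a cached binary into place only if it is not already there,
	    mirroring the log's `stat -c "%s %y"` existence check before scp."""
	    try:
	        os.stat(dest_path)             # existence check
	        return False                   # already present; skip transfer
	    except FileNotFoundError:
	        os.makedirs(os.path.dirname(dest_path), exist_ok=True)
	        shutil.copy2(cache_path, dest_path)  # stand-in for scp
	        return True
	```

	Skipping the copy when the file exists is what makes repeated starts cheap; the real code also compares size and mtime, which this sketch omits.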
	I0831 15:31:24.520609    2876 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:31:24.528302    2876 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:31:24.542845    2876 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:31:24.556549    2876 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:31:24.581157    2876 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:31:24.584179    2876 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
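	The /etc/hosts one-liner above is an upsert: `grep -v` drops any stale line ending in a tab plus the hostname, then the fresh `<ip>\t<host>` mapping is appended and the temp file copied back. The same logic in Python (function name hypothetical):

	```python
	def upsert_host(hosts_text, ip, name):
	    """Drop any existing "\t<name>" entry, then append "<ip>\t<name>",
	    like the log's `{ grep -v ...; echo ...; } > /tmp/h.$$` pipeline."""
	    kept = [line for line in hosts_text.splitlines()
	            if not line.endswith("\t" + name)]
	    kept.append(f"{ip}\t{name}")
	    return "\n".join(kept) + "\n"
	```

	Matching on the trailing `\t<name>` rather than the IP lets the entry be rewritten even when the VIP address changes between runs.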
	I0831 15:31:24.593696    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:24.689916    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:31:24.707403    2876 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:31:24.707700    2876 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:31:24.707728    2876 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:31:24.717047    2876 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51156
	I0831 15:31:24.717380    2876 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:31:24.717728    2876 main.go:141] libmachine: Using API Version  1
	I0831 15:31:24.717743    2876 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:31:24.718003    2876 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:31:24.718123    2876 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:31:24.718213    2876 start.go:317] joinCluster: &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 Clu
sterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:fals
e inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimi
zations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:31:24.718336    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0831 15:31:24.718349    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:31:24.718430    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:31:24.718495    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:31:24.718573    2876 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:31:24.718638    2876 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:31:24.810129    2876 start.go:343] trying to join control-plane node "m03" to cluster: &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:24.810181    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token l0ka7f.9kdk1py3wyogvy9t --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443"
	I0831 15:31:52.526613    2876 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token l0ka7f.9kdk1py3wyogvy9t --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-949000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443": (27.716564604s)
	I0831 15:31:52.526639    2876 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0831 15:31:53.011028    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-949000-m03 minikube.k8s.io/updated_at=2024_08_31T15_31_53_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=ha-949000 minikube.k8s.io/primary=false
	I0831 15:31:53.087862    2876 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-949000-m03 node-role.kubernetes.io/control-plane:NoSchedule-
	I0831 15:31:53.172826    2876 start.go:319] duration metric: took 28.454760565s to joinCluster
	I0831 15:31:53.172884    2876 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:31:53.173075    2876 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:31:53.197446    2876 out.go:177] * Verifying Kubernetes components...
	I0831 15:31:53.254031    2876 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:31:53.535623    2876 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:31:53.558317    2876 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:31:53.558557    2876 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, U
serAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x48c7c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:31:53.558593    2876 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:31:53.558836    2876 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:31:53.558893    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:53.558899    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:53.558906    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:53.558909    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:53.561151    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:54.058994    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:54.059009    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:54.059015    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:54.059020    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:54.061381    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:54.559376    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:54.559389    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:54.559396    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:54.559399    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:54.561772    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:55.059628    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:55.059676    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:55.059690    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:55.059700    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:55.063078    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:55.559418    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:55.559433    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:55.559439    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:55.559442    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:55.561338    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:55.561664    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:31:56.059758    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:56.059770    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:56.059776    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:56.059780    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:56.061794    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:56.560083    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:56.560095    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:56.560101    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:56.560105    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:56.562114    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.058995    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:57.059011    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:57.059017    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:57.059021    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:57.060963    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.560137    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:57.560149    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:57.560155    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:57.560159    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:57.561978    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:31:57.562328    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:31:58.059061    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:58.059074    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:58.059080    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:58.059086    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:58.061472    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:58.559244    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:58.559270    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:58.559282    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:58.559289    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:58.562722    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:59.060308    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:59.060330    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:59.060342    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:59.060359    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:59.063517    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:31:59.560099    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:31:59.560116    2876 round_trippers.go:469] Request Headers:
	I0831 15:31:59.560125    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:31:59.560129    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:31:59.562184    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:31:59.562628    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:00.059591    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:00.059615    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:00.059662    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:00.059677    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:00.063389    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:00.560430    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:00.560444    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:00.560451    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:00.560455    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:00.562483    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:01.059473    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:01.059498    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:01.059509    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:01.059514    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:01.062773    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:01.559271    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:01.559298    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:01.559310    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:01.559317    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:01.562641    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:01.563242    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:02.060140    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:02.060168    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:02.060211    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:02.060244    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:02.063601    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:02.559282    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:02.559308    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:02.559320    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:02.559329    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:02.562623    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:03.059890    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:03.059911    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:03.059923    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:03.059930    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:03.063409    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:03.559394    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:03.559453    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:03.559465    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:03.559470    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:03.562567    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:04.060698    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:04.060714    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:04.060719    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:04.060727    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:04.062955    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:04.063278    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:04.560096    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:04.560118    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:04.560165    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:04.560173    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:04.562791    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:05.060622    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:05.060648    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:05.060659    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:05.060665    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:05.064011    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:05.559954    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:05.559976    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:05.559988    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:05.559994    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:05.563422    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:06.059812    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:06.059870    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:06.059880    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:06.059886    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:06.062529    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:06.560071    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:06.560096    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:06.560107    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:06.560113    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:06.563538    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:06.564037    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:07.059298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:07.059324    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:07.059335    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:07.059342    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:07.063048    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:07.559252    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:07.559277    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:07.559291    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:07.559297    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:07.562373    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:08.061149    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:08.061210    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:08.061223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:08.061234    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:08.064402    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:08.559428    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:08.559452    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:08.559463    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:08.559468    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:08.562526    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:09.060827    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:09.060878    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:09.060891    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:09.060900    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:09.063954    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:09.064537    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:09.561212    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:09.561237    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:09.561283    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:09.561292    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:09.564677    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:10.060675    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:10.060694    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:10.060714    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:10.060718    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:10.062779    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:10.560397    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:10.560424    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:10.560435    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:10.560441    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:10.564079    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.060679    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:11.060705    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:11.060716    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:11.060722    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:11.064114    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.559466    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:11.559492    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:11.559503    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:11.559567    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:11.562752    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:11.563402    2876 node_ready.go:53] node "ha-949000-m03" has status "Ready":"False"
	I0831 15:32:12.059348    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:12.059373    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:12.059384    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:12.059389    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:12.062810    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:12.561048    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:12.561106    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:12.561120    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:12.561141    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:12.564459    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.059831    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.059855    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.059867    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.059873    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.063079    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.063582    2876 node_ready.go:49] node "ha-949000-m03" has status "Ready":"True"
	I0831 15:32:13.063594    2876 node_ready.go:38] duration metric: took 19.504599366s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:32:13.063602    2876 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:32:13.063657    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:13.063665    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.063674    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.063682    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.067458    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.072324    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.072373    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:32:13.072379    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.072385    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.072389    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.074327    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.074802    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.074810    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.074815    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.074820    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.076654    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.076987    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.076996    2876 pod_ready.go:82] duration metric: took 4.661444ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.077003    2876 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.077041    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:32:13.077046    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.077052    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.077056    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.078862    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.079264    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.079271    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.079277    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.079280    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.081027    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.081326    2876 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.081335    2876 pod_ready.go:82] duration metric: took 4.326858ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.081342    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.081372    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:32:13.081379    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.081385    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.081388    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.083263    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.083632    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.083639    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.083645    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.083649    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.085181    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.085480    2876 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.085490    2876 pod_ready.go:82] duration metric: took 4.142531ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.085497    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.085526    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:32:13.085531    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.085537    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.085541    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.087128    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.087501    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:13.087508    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.087513    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.087518    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.088959    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.089244    2876 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.089252    2876 pod_ready.go:82] duration metric: took 3.751049ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.089258    2876 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.261887    2876 request.go:632] Waited for 172.592535ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:32:13.261972    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:32:13.261978    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.262019    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.262028    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.264296    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:13.460589    2876 request.go:632] Waited for 195.842812ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.460724    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:13.460735    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.460745    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.460759    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.463962    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.464378    2876 pod_ready.go:93] pod "etcd-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.464391    2876 pod_ready.go:82] duration metric: took 375.12348ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
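The `Waited for ... due to client-side throttling` lines come from the Kubernetes client's client-side rate limiter, which delays requests once a token bucket is exhausted. A minimal token-bucket sketch of that behavior (the QPS and burst values here are illustrative, not client-go's defaults):

```go
package main

import (
	"fmt"
	"time"
)

// limiter is a minimal token-bucket sketch of client-side request
// throttling: tokens accrue at qps per second up to burst, and each
// request consumes one token or waits for one to accrue.
type limiter struct {
	tokens float64
	burst  float64
	qps    float64
	last   time.Time
}

func newLimiter(qps, burst float64) *limiter {
	return &limiter{tokens: burst, burst: burst, qps: qps, last: time.Now()}
}

// wait blocks until a token is available and returns how long it waited.
func (l *limiter) wait() time.Duration {
	now := time.Now()
	l.tokens += now.Sub(l.last).Seconds() * l.qps
	if l.tokens > l.burst {
		l.tokens = l.burst
	}
	l.last = now
	if l.tokens >= 1 {
		l.tokens--
		return 0
	}
	// Not enough tokens: sleep until one accrues.
	d := time.Duration((1 - l.tokens) / l.qps * float64(time.Second))
	time.Sleep(d)
	l.tokens = 0
	l.last = time.Now()
	return d
}

func main() {
	l := newLimiter(5, 2) // 5 requests/sec, burst of 2
	for i := 0; i < 4; i++ {
		if waited := l.wait(); waited > 0 {
			fmt.Printf("request %d waited %v (throttled)\n", i, waited.Round(time.Millisecond))
		} else {
			fmt.Printf("request %d sent immediately\n", i)
		}
	}
}
```

This matches the pattern in the log: back-to-back pod and node GETs burn through the burst, after which each request reports a wait of roughly one token interval (~195ms here).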
	I0831 15:32:13.464404    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.661862    2876 request.go:632] Waited for 197.406518ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:32:13.661977    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:32:13.661988    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.661999    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.662005    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.665393    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:13.861181    2876 request.go:632] Waited for 195.385788ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.861214    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:13.861218    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:13.861225    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:13.861260    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:13.863261    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:13.863567    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:13.863577    2876 pod_ready.go:82] duration metric: took 399.161484ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:13.863584    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.061861    2876 request.go:632] Waited for 198.232413ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:32:14.061952    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:32:14.061961    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.061972    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.061979    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.064530    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:14.260004    2876 request.go:632] Waited for 194.98208ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:14.260143    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:14.260166    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.260182    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.260227    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.266580    2876 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:32:14.266908    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:14.266927    2876 pod_ready.go:82] duration metric: took 403.325368ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.266937    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.460025    2876 request.go:632] Waited for 193.045445ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:32:14.460093    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:32:14.460101    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.460110    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.460117    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.462588    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:14.660940    2876 request.go:632] Waited for 197.721547ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:14.661070    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:14.661080    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.661096    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.661109    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.664541    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:14.664954    2876 pod_ready.go:93] pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:14.664967    2876 pod_ready.go:82] duration metric: took 398.020825ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.664979    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:14.861147    2876 request.go:632] Waited for 196.115866ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:32:14.861203    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:32:14.861211    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:14.861223    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:14.861231    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:14.864847    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.060912    2876 request.go:632] Waited for 195.310518ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:15.060968    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:15.060983    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.061000    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.061011    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.064271    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.064583    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.064594    2876 pod_ready.go:82] duration metric: took 399.604845ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.064603    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.260515    2876 request.go:632] Waited for 195.841074ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:32:15.260662    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:32:15.260676    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.260688    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.260702    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.264411    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.461372    2876 request.go:632] Waited for 196.432681ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:15.461470    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:15.461484    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.461502    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.461513    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.464382    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:15.464683    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.464691    2876 pod_ready.go:82] duration metric: took 400.078711ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.464700    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.660288    2876 request.go:632] Waited for 195.551444ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:32:15.660318    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:32:15.660323    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.660357    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.660363    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.663247    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:15.860473    2876 request.go:632] Waited for 196.823661ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:15.860532    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:15.860542    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:15.860556    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:15.860563    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:15.863954    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:15.864333    2876 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:15.864346    2876 pod_ready.go:82] duration metric: took 399.636293ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:15.864355    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.060306    2876 request.go:632] Waited for 195.900703ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:32:16.060410    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:32:16.060437    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.060449    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.060455    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.063745    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.260402    2876 request.go:632] Waited for 195.997957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:16.260523    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:16.260539    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.260551    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.260563    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.264052    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.264373    2876 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:16.264385    2876 pod_ready.go:82] duration metric: took 400.01997ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.264394    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.461128    2876 request.go:632] Waited for 196.682855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:32:16.461251    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:32:16.461264    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.461275    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.461282    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.464602    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.660248    2876 request.go:632] Waited for 195.08291ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:16.660298    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:16.660310    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.660327    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.660340    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.663471    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:16.664017    2876 pod_ready.go:93] pod "kube-proxy-d45q5" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:16.664029    2876 pod_ready.go:82] duration metric: took 399.623986ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.664038    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:16.859948    2876 request.go:632] Waited for 195.845325ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:32:16.860034    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:32:16.860060    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:16.860083    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:16.860094    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:16.863263    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.060250    2876 request.go:632] Waited for 196.410574ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.060307    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.060319    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.060334    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.060345    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.063664    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.064113    2876 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.064125    2876 pod_ready.go:82] duration metric: took 400.076522ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.064134    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.260150    2876 request.go:632] Waited for 195.935266ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:32:17.260232    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:32:17.260246    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.260305    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.260324    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.263756    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.460703    2876 request.go:632] Waited for 196.426241ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.460753    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:32:17.460765    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.460776    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.460799    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.463925    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:17.464439    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.464449    2876 pod_ready.go:82] duration metric: took 400.306164ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.464463    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.660506    2876 request.go:632] Waited for 196.00354ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:32:17.660541    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:32:17.660547    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.660553    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.660568    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.662504    2876 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:32:17.859973    2876 request.go:632] Waited for 197.106962ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:17.860023    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:32:17.860031    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:17.860084    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:17.860092    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:17.869330    2876 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0831 15:32:17.869629    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:17.869638    2876 pod_ready.go:82] duration metric: took 405.16449ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:17.869646    2876 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:18.060370    2876 request.go:632] Waited for 190.671952ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:32:18.060479    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:32:18.060492    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.060504    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.060511    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.063196    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.260902    2876 request.go:632] Waited for 197.387182ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:18.260947    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:32:18.260955    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.260976    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.261000    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.263780    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.264154    2876 pod_ready.go:93] pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:32:18.264163    2876 pod_ready.go:82] duration metric: took 394.508983ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:32:18.264171    2876 pod_ready.go:39] duration metric: took 5.200505122s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:32:18.264182    2876 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:32:18.264235    2876 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:32:18.276016    2876 api_server.go:72] duration metric: took 25.102905505s to wait for apiserver process to appear ...
	I0831 15:32:18.276029    2876 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:32:18.276040    2876 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:32:18.280474    2876 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:32:18.280519    2876 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:32:18.280525    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.280531    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.280535    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.281148    2876 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:32:18.281176    2876 api_server.go:141] control plane version: v1.31.0
	I0831 15:32:18.281184    2876 api_server.go:131] duration metric: took 5.150155ms to wait for apiserver health ...
	I0831 15:32:18.281189    2876 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:32:18.460471    2876 request.go:632] Waited for 179.236076ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.460573    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.460585    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.460596    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.460604    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.465317    2876 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:32:18.469906    2876 system_pods.go:59] 24 kube-system pods found
	I0831 15:32:18.469918    2876 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:32:18.469921    2876 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:32:18.469925    2876 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:32:18.469928    2876 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:32:18.469933    2876 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:32:18.469937    2876 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:32:18.469939    2876 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:32:18.469943    2876 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:32:18.469946    2876 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:32:18.469949    2876 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:32:18.469954    2876 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:32:18.469958    2876 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:32:18.469961    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:32:18.469963    2876 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:32:18.469966    2876 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:32:18.469969    2876 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:32:18.469972    2876 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:32:18.469975    2876 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:32:18.469978    2876 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:32:18.469980    2876 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:32:18.469983    2876 system_pods.go:61] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:32:18.469985    2876 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:32:18.469988    2876 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:32:18.469990    2876 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:32:18.469994    2876 system_pods.go:74] duration metric: took 188.799972ms to wait for pod list to return data ...
	I0831 15:32:18.470000    2876 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:32:18.659945    2876 request.go:632] Waited for 189.894855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:32:18.659986    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:32:18.660002    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.660011    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.660017    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.662843    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:18.662901    2876 default_sa.go:45] found service account: "default"
	I0831 15:32:18.662910    2876 default_sa.go:55] duration metric: took 192.903479ms for default service account to be created ...
	I0831 15:32:18.662915    2876 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:32:18.860267    2876 request.go:632] Waited for 197.296928ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.860299    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:32:18.860304    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:18.860310    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:18.860316    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:18.864052    2876 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:32:18.868873    2876 system_pods.go:86] 24 kube-system pods found
	I0831 15:32:18.868886    2876 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:32:18.868891    2876 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:32:18.868894    2876 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:32:18.868897    2876 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:32:18.868901    2876 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:32:18.868904    2876 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:32:18.868907    2876 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:32:18.868912    2876 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:32:18.868916    2876 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:32:18.868918    2876 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:32:18.868922    2876 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:32:18.868927    2876 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:32:18.868931    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:32:18.868934    2876 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:32:18.868938    2876 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:32:18.868941    2876 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:32:18.868944    2876 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:32:18.868947    2876 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:32:18.868950    2876 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:32:18.868953    2876 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:32:18.868957    2876 system_pods.go:89] "kube-vip-ha-949000" [933b8e54-299e-44c1-8dea-69aba92adbd4] Running
	I0831 15:32:18.868959    2876 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:32:18.868963    2876 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:32:18.868966    2876 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:32:18.868971    2876 system_pods.go:126] duration metric: took 206.049826ms to wait for k8s-apps to be running ...
	I0831 15:32:18.868980    2876 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:32:18.869030    2876 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:32:18.880958    2876 system_svc.go:56] duration metric: took 11.976044ms WaitForService to wait for kubelet
	I0831 15:32:18.880978    2876 kubeadm.go:582] duration metric: took 25.707859659s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:32:18.880990    2876 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:32:19.060320    2876 request.go:632] Waited for 179.26426ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:32:19.060365    2876 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:32:19.060371    2876 round_trippers.go:469] Request Headers:
	I0831 15:32:19.060379    2876 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:32:19.060385    2876 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:32:19.063168    2876 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:32:19.063767    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063776    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063782    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063785    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063789    2876 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:32:19.063791    2876 node_conditions.go:123] node cpu capacity is 2
	I0831 15:32:19.063794    2876 node_conditions.go:105] duration metric: took 182.798166ms to run NodePressure ...
	I0831 15:32:19.063802    2876 start.go:241] waiting for startup goroutines ...
	I0831 15:32:19.063817    2876 start.go:255] writing updated cluster config ...
	I0831 15:32:19.064186    2876 ssh_runner.go:195] Run: rm -f paused
	I0831 15:32:19.107477    2876 start.go:600] kubectl: 1.29.2, cluster: 1.31.0 (minor skew: 2)
	I0831 15:32:19.128559    2876 out.go:201] 
	W0831 15:32:19.149451    2876 out.go:270] ! /usr/local/bin/kubectl is version 1.29.2, which may have incompatibilities with Kubernetes 1.31.0.
	I0831 15:32:19.170407    2876 out.go:177]   - Want kubectl v1.31.0? Try 'minikube kubectl -- get pods -A'
	I0831 15:32:19.212551    2876 out.go:177] * Done! kubectl is now configured to use "ha-949000" cluster and "default" namespace by default
	
	
	==> Docker <==
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/7da75377db13c80b27b99ccc9f52561a4408675361947cf393e0c38286a71997/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.201910840Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202112013Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202132705Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.202328611Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/1017bd5eac1d26de2df318c0dc0ac8d5db92d72e8c268401502a145b3ad0d9d8/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:30:08Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/271da20951c9ab4102e979dc2b97b3a9c8d992db5fc7ebac3f954ea9edee9d48/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.346950244Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347136993Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347223771Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.347348772Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379063396Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379210402Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379226413Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:30:08 ha-949000 dockerd[1279]: time="2024-08-31T22:30:08.379336044Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.320619490Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.320945499Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.321018153Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 dockerd[1279]: time="2024-08-31T22:32:21.321131565Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:21 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:32:21Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/f68483c946835415bfdf0531bfc6be41dd321162f4c19af555ece0f66ee7cabe/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Aug 31 22:32:22 ha-949000 cri-dockerd[1172]: time="2024-08-31T22:32:22Z" level=info msg="Stop pulling image gcr.io/k8s-minikube/busybox:1.28: Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:1.28"
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716842379Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716906766Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.716920530Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:32:22 ha-949000 dockerd[1279]: time="2024-08-31T22:32:22.721236974Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	2f925f16b74b0       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   3 minutes ago       Running             busybox                   0                   f68483c946835       busybox-7dff88458-5kkbw
	b1db836cd7a3d       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   0                   271da20951c9a       coredns-6f6b679f8f-kjszm
	def4d6bd20bc5       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   0                   1017bd5eac1d2       coredns-6f6b679f8f-snq8s
	22fbb8a8e01ad       6e38f40d628db                                                                                         5 minutes ago       Running             storage-provisioner       0                   7da75377db13c       storage-provisioner
	6d156ce626115       kindest/kindnetd@sha256:e59a687ca28ae274a2fc92f1e2f5f1c739f353178a43a23aafc71adb802ed166              5 minutes ago       Running             kindnet-cni               0                   7d1851c17485c       kindnet-jzj42
	54d5f8041c89d       ad83b2ca7b09e                                                                                         5 minutes ago       Running             kube-proxy                0                   4b0198ac7dc52       kube-proxy-q7ndn
	c99fe831b20c1       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     5 minutes ago       Running             kube-vip                  0                   9ef7e0fa361d5       kube-vip-ha-949000
	c734c23a53082       2e96e5913fc06                                                                                         5 minutes ago       Running             etcd                      0                   7cfaf9f5d4dd4       etcd-ha-949000
	02c10e4f765d1       1766f54c897f0                                                                                         5 minutes ago       Running             kube-scheduler            0                   c084f2a259f6c       kube-scheduler-ha-949000
	6670fd34164cb       045733566833c                                                                                         5 minutes ago       Running             kube-controller-manager   0                   f9573e28f9d4d       kube-controller-manager-ha-949000
	ffec6106be6c8       604f5db92eaa8                                                                                         5 minutes ago       Running             kube-apiserver            0                   25c49852f78dc       kube-apiserver-ha-949000
	
	
	==> coredns [b1db836cd7a3] <==
	[INFO] 10.244.1.2:56414 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000107837s
	[INFO] 10.244.1.2:53184 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000079726s
	[INFO] 10.244.1.2:58757 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000418868s
	[INFO] 10.244.1.2:39299 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000067106s
	[INFO] 10.244.2.2:56948 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000080585s
	[INFO] 10.244.2.2:56973 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000078985s
	[INFO] 10.244.2.2:43081 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000100123s
	[INFO] 10.244.2.2:56390 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000040214s
	[INFO] 10.244.2.2:52519 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000061255s
	[INFO] 10.244.0.4:36226 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000151133s
	[INFO] 10.244.1.2:44017 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089111s
	[INFO] 10.244.1.2:37224 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000069144s
	[INFO] 10.244.1.2:51282 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000118723s
	[INFO] 10.244.2.2:35009 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089507s
	[INFO] 10.244.2.2:60607 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000049176s
	[INFO] 10.244.2.2:36851 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000097758s
	[INFO] 10.244.0.4:59717 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000053986s
	[INFO] 10.244.0.4:58447 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000060419s
	[INFO] 10.244.1.2:60381 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000136898s
	[INFO] 10.244.1.2:32783 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.00010303s
	[INFO] 10.244.1.2:44904 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000042493s
	[INFO] 10.244.1.2:44085 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000132084s
	[INFO] 10.244.2.2:43635 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000080947s
	[INFO] 10.244.2.2:40020 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000081919s
	[INFO] 10.244.2.2:53730 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000058015s
	
	
	==> coredns [def4d6bd20bc] <==
	[INFO] 10.244.0.4:41865 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.008744161s
	[INFO] 10.244.1.2:50080 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000093199s
	[INFO] 10.244.1.2:55576 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.000574417s
	[INFO] 10.244.1.2:36293 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000065455s
	[INFO] 10.244.2.2:41223 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000063892s
	[INFO] 10.244.0.4:54135 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000096141s
	[INFO] 10.244.0.4:39176 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000742646s
	[INFO] 10.244.0.4:58445 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000080113s
	[INFO] 10.244.0.4:56242 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000066269s
	[INFO] 10.244.0.4:60657 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049645s
	[INFO] 10.244.1.2:48306 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000561931s
	[INFO] 10.244.1.2:40767 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000077826s
	[INFO] 10.244.1.2:35669 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000056994s
	[INFO] 10.244.1.2:57720 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000040565s
	[INFO] 10.244.2.2:38794 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000136901s
	[INFO] 10.244.2.2:33576 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000052374s
	[INFO] 10.244.2.2:57053 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000051289s
	[INFO] 10.244.0.4:47623 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000056903s
	[INFO] 10.244.0.4:59818 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00003011s
	[INFO] 10.244.0.4:53586 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000029565s
	[INFO] 10.244.1.2:60045 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000060878s
	[INFO] 10.244.2.2:38400 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000078624s
	[INFO] 10.244.0.4:58765 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000075707s
	[INFO] 10.244.0.4:32804 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000050785s
	[INFO] 10.244.2.2:48459 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.00007773s
	
	
	==> describe nodes <==
	Name:               ha-949000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:29:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:35:30 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:48 +0000   Sat, 31 Aug 2024 22:30:07 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-949000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 e8535f0b09e14aea8b2456a9d977fc80
	  System UUID:                98ca49d1-0000-0000-9e6c-321a4533d56e
	  Boot ID:                    4896b77b-e0f4-43c0-af0e-3998b4352bec
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-5kkbw              0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m14s
	  kube-system                 coredns-6f6b679f8f-kjszm             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     5m45s
	  kube-system                 coredns-6f6b679f8f-snq8s             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     5m45s
	  kube-system                 etcd-ha-949000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         5m50s
	  kube-system                 kindnet-jzj42                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      5m46s
	  kube-system                 kube-apiserver-ha-949000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         5m51s
	  kube-system                 kube-controller-manager-ha-949000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         5m50s
	  kube-system                 kube-proxy-q7ndn                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m46s
	  kube-system                 kube-scheduler-ha-949000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         5m52s
	  kube-system                 kube-vip-ha-949000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m52s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m46s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age    From             Message
	  ----    ------                   ----   ----             -------
	  Normal  Starting                 5m44s  kube-proxy       
	  Normal  Starting                 5m50s  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  5m50s  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  5m50s  kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    5m50s  kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     5m50s  kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           5m46s  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeReady                5m27s  kubelet          Node ha-949000 status is now: NodeReady
	  Normal  RegisteredNode           4m46s  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           3m36s  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           65s    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	
	
	Name:               ha-949000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:30:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:35:24 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:34:22 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:34:22 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:34:22 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:34:22 +0000   Sat, 31 Aug 2024 22:31:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-949000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 44b968080187442b981a33d77e4f86aa
	  System UUID:                23e54f3d-0000-0000-86b7-b25c818528d1
	  Boot ID:                    4ddbe4b0-7ef0-4715-a631-f977c123c463
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-6r9s5                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m14s
	  kube-system                 etcd-ha-949000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         4m51s
	  kube-system                 kindnet-brtj6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      4m53s
	  kube-system                 kube-apiserver-ha-949000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         4m51s
	  kube-system                 kube-controller-manager-ha-949000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         4m48s
	  kube-system                 kube-proxy-4r2bt                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m53s
	  kube-system                 kube-scheduler-ha-949000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         4m47s
	  kube-system                 kube-vip-ha-949000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m49s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 4m49s                  kube-proxy       
	  Normal   Starting                 68s                    kube-proxy       
	  Normal   NodeHasSufficientMemory  4m53s (x8 over 4m53s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    4m53s (x8 over 4m53s)  kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     4m53s (x7 over 4m53s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  4m53s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           4m51s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           4m46s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           3m36s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   Starting                 72s                    kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  72s                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  72s                    kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    72s                    kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     72s                    kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 72s                    kubelet          Node ha-949000-m02 has been rebooted, boot id: 4ddbe4b0-7ef0-4715-a631-f977c123c463
	  Normal   RegisteredNode           65s                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	
	
	Name:               ha-949000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_31_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:31:50 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:35:24 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:32:52 +0000   Sat, 31 Aug 2024 22:32:13 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-949000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 0aea5b50957a40edad0152e71b7f3a2a
	  System UUID:                3fde4d5b-0000-0000-8412-6ae6e5c787bb
	  Boot ID:                    2d4c31ca-c268-4eb4-ad45-716d78aaaa5c
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-vjf9x                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m14s
	  kube-system                 etcd-ha-949000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         3m41s
	  kube-system                 kindnet-9j85v                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      3m44s
	  kube-system                 kube-apiserver-ha-949000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         3m41s
	  kube-system                 kube-controller-manager-ha-949000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         3m43s
	  kube-system                 kube-proxy-d45q5                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m44s
	  kube-system                 kube-scheduler-ha-949000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         3m43s
	  kube-system                 kube-vip-ha-949000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m40s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 3m40s                  kube-proxy       
	  Normal  NodeHasSufficientMemory  3m44s (x8 over 3m44s)  kubelet          Node ha-949000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    3m44s (x8 over 3m44s)  kubelet          Node ha-949000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     3m44s (x7 over 3m44s)  kubelet          Node ha-949000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  3m44s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           3m41s                  node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal  RegisteredNode           3m41s                  node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal  RegisteredNode           3m36s                  node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal  RegisteredNode           65s                    node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	
	
	==> dmesg <==
	[  +2.774485] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.237441] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000003] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.596627] systemd-fstab-generator[494]: Ignoring "noauto" option for root device
	[  +0.090743] systemd-fstab-generator[506]: Ignoring "noauto" option for root device
	[  +1.756564] systemd-fstab-generator[845]: Ignoring "noauto" option for root device
	[  +0.273405] systemd-fstab-generator[883]: Ignoring "noauto" option for root device
	[  +0.102089] systemd-fstab-generator[895]: Ignoring "noauto" option for root device
	[  +0.058959] kauditd_printk_skb: 115 callbacks suppressed
	[  +0.059797] systemd-fstab-generator[909]: Ignoring "noauto" option for root device
	[  +2.526421] systemd-fstab-generator[1125]: Ignoring "noauto" option for root device
	[  +0.100331] systemd-fstab-generator[1137]: Ignoring "noauto" option for root device
	[  +0.099114] systemd-fstab-generator[1149]: Ignoring "noauto" option for root device
	[  +0.141519] systemd-fstab-generator[1164]: Ignoring "noauto" option for root device
	[  +3.497423] systemd-fstab-generator[1265]: Ignoring "noauto" option for root device
	[  +0.066902] kauditd_printk_skb: 158 callbacks suppressed
	[  +2.572406] systemd-fstab-generator[1521]: Ignoring "noauto" option for root device
	[  +3.569896] systemd-fstab-generator[1651]: Ignoring "noauto" option for root device
	[  +0.054418] kauditd_printk_skb: 70 callbacks suppressed
	[  +7.004094] systemd-fstab-generator[2150]: Ignoring "noauto" option for root device
	[  +0.086539] kauditd_printk_skb: 72 callbacks suppressed
	[  +5.400345] kauditd_printk_skb: 12 callbacks suppressed
	[  +5.311598] kauditd_printk_skb: 29 callbacks suppressed
	[Aug31 22:30] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [c734c23a5308] <==
	{"level":"warn","ts":"2024-08-31T22:34:11.718634Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:11.743788Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:11.791850Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:11.818353Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:11.918542Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:12.017840Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:12.118676Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:12.217677Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:34:13.049659Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.6:2380/version","remote-member-id":"316786cc150e7430","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:34:13.049874Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"316786cc150e7430","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:34:16.551277Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"316786cc150e7430","rtt":"6.281624ms","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:34:16.551290Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"316786cc150e7430","rtt":"647.596µs","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:34:17.052755Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.6:2380/version","remote-member-id":"316786cc150e7430","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:34:17.052804Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"316786cc150e7430","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:34:21.055590Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.6:2380/version","remote-member-id":"316786cc150e7430","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:34:21.055652Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"316786cc150e7430","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:34:21.552021Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"316786cc150e7430","rtt":"6.281624ms","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:34:21.552155Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"316786cc150e7430","rtt":"647.596µs","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"info","ts":"2024-08-31T22:34:24.571395Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:34:24.571442Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:34:24.631480Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"316786cc150e7430","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-08-31T22:34:24.631525Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:34:24.635489Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:34:24.660962Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"316786cc150e7430","stream-type":"stream Message"}
	{"level":"info","ts":"2024-08-31T22:34:24.661005Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	
	
	==> kernel <==
	 22:35:35 up 6 min,  0 users,  load average: 0.26, 0.22, 0.11
	Linux ha-949000 5.10.207 #1 SMP Wed Aug 28 20:54:17 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [6d156ce62611] <==
	I0831 22:34:45.622428       1 main.go:299] handling current node
	I0831 22:34:55.614042       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:34:55.614115       1 main.go:299] handling current node
	I0831 22:34:55.614136       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:34:55.614144       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:34:55.614519       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:34:55.614660       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:05.622719       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:05.622775       1 main.go:299] handling current node
	I0831 22:35:05.622786       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:05.622791       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:05.622902       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:05.622929       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:15.620526       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:15.620551       1 main.go:299] handling current node
	I0831 22:35:15.620562       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:15.620567       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:15.620685       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:15.620720       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:25.613908       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:25.614028       1 main.go:299] handling current node
	I0831 22:35:25.614079       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:25.614094       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:25.614736       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:25.614790       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [ffec6106be6c] <==
	I0831 22:29:42.351464       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0831 22:29:42.447047       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0831 22:29:42.450860       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	I0831 22:29:42.451599       1 controller.go:615] quota admission added evaluator for: endpoints
	I0831 22:29:42.454145       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0831 22:29:43.117776       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0831 22:29:44.628868       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0831 22:29:44.643482       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0831 22:29:44.649286       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0831 22:29:48.568363       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	I0831 22:29:48.768446       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	E0831 22:32:24.583976       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51190: use of closed network connection
	E0831 22:32:24.787019       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51192: use of closed network connection
	E0831 22:32:24.994355       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51194: use of closed network connection
	E0831 22:32:25.183977       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51196: use of closed network connection
	E0831 22:32:25.381277       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51198: use of closed network connection
	E0831 22:32:25.569952       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51200: use of closed network connection
	E0831 22:32:25.763008       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51202: use of closed network connection
	E0831 22:32:25.965367       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51204: use of closed network connection
	E0831 22:32:26.154701       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51206: use of closed network connection
	E0831 22:32:26.694309       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51211: use of closed network connection
	E0831 22:32:26.880399       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51213: use of closed network connection
	E0831 22:32:27.077320       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51215: use of closed network connection
	E0831 22:32:27.267610       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51217: use of closed network connection
	E0831 22:32:27.476005       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51219: use of closed network connection
	
	
	==> kube-controller-manager [6670fd34164c] <==
	I0831 22:32:13.164123       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:20.074086       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="91.437594ms"
	I0831 22:32:20.089117       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="14.696904ms"
	I0831 22:32:20.155832       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="66.417676ms"
	I0831 22:32:20.247938       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="91.617712ms"
	E0831 22:32:20.248480       1 replica_set.go:560] "Unhandled Error" err="sync \"default/busybox-7dff88458\" failed with Operation cannot be fulfilled on replicasets.apps \"busybox-7dff88458\": the object has been modified; please apply your changes to the latest version and try again" logger="UnhandledError"
	I0831 22:32:20.257744       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="7.890782ms"
	I0831 22:32:20.258053       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="29.491µs"
	I0831 22:32:20.352807       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="29.639µs"
	I0831 22:32:21.164054       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:32:21.310383       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="34.795µs"
	I0831 22:32:22.115926       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="5.066721ms"
	I0831 22:32:22.116004       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="26.449µs"
	I0831 22:32:23.502335       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="6.289855ms"
	I0831 22:32:23.502432       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="58.061µs"
	I0831 22:32:24.043757       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="4.626106ms"
	I0831 22:32:24.044703       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="46.785µs"
	I0831 22:32:44.005602       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m02"
	I0831 22:32:48.178405       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000"
	I0831 22:32:52.115444       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:34:23.407685       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m02"
	I0831 22:34:24.307976       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="39.638356ms"
	I0831 22:34:24.308054       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="36.891µs"
	I0831 22:34:26.919796       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="6.03118ms"
	I0831 22:34:26.919865       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="25.273µs"
	
	
	==> kube-proxy [54d5f8041c89] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:29:49.977338       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:29:49.983071       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:29:49.983430       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:29:50.023032       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:29:50.023054       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:29:50.023070       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:29:50.025790       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:29:50.026014       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:29:50.026061       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:29:50.026844       1 config.go:197] "Starting service config controller"
	I0831 22:29:50.027602       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:29:50.027141       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:29:50.027698       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:29:50.027260       1 config.go:326] "Starting node config controller"
	I0831 22:29:50.027720       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:29:50.128122       1 shared_informer.go:320] Caches are synced for node config
	I0831 22:29:50.128144       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:29:50.128162       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-scheduler [02c10e4f765d] <==
	W0831 22:29:42.107023       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0831 22:29:42.107231       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.111966       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0831 22:29:42.112045       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.116498       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0831 22:29:42.116539       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.129701       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0831 22:29:42.129741       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError"
	I0831 22:29:45.342252       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0831 22:31:50.464567       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.464652       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod 9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2(kube-system/kube-proxy-d45q5) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-d45q5"
	E0831 22:31:50.464667       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" pod="kube-system/kube-proxy-d45q5"
	I0831 22:31:50.464683       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.476710       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:31:50.476756       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod c551bb18-9a7d-4fca-9724-be7900980a40(kube-system/kindnet-l4zbh) wasn't assumed so cannot be forgotten" pod="kube-system/kindnet-l4zbh"
	E0831 22:31:50.476767       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" pod="kube-system/kindnet-l4zbh"
	I0831 22:31:50.476781       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:32:20.049491       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-6r9s5" node="ha-949000-m02"
	E0831 22:32:20.049618       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" pod="default/busybox-7dff88458-6r9s5"
	E0831 22:32:20.071235       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-vjf9x" node="ha-949000-m03"
	E0831 22:32:20.071466       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" pod="default/busybox-7dff88458-vjf9x"
	E0831 22:32:20.073498       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	E0831 22:32:20.073571       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod e97e21d8-a69e-451c-babd-6232e12aafe0(default/busybox-7dff88458-5kkbw) wasn't assumed so cannot be forgotten" pod="default/busybox-7dff88458-5kkbw"
	E0831 22:32:20.077323       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" pod="default/busybox-7dff88458-5kkbw"
	I0831 22:32:20.077394       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	
	
	==> kubelet <==
	Aug 31 22:30:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:30:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:31:44 ha-949000 kubelet[2157]: E0831 22:31:44.490275    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:31:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:31:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:32:20 ha-949000 kubelet[2157]: W0831 22:32:20.081132    2157 reflector.go:561] object-"default"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ha-949000" cannot list resource "configmaps" in API group "" in the namespace "default": no relationship found between node 'ha-949000' and this object
	Aug 31 22:32:20 ha-949000 kubelet[2157]: E0831 22:32:20.081252    2157 reflector.go:158] "Unhandled Error" err="object-\"default\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ha-949000\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"default\": no relationship found between node 'ha-949000' and this object" logger="UnhandledError"
	Aug 31 22:32:20 ha-949000 kubelet[2157]: I0831 22:32:20.223174    2157 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l95k\" (UniqueName: \"kubernetes.io/projected/e97e21d8-a69e-451c-babd-6232e12aafe0-kube-api-access-6l95k\") pod \"busybox-7dff88458-5kkbw\" (UID: \"e97e21d8-a69e-451c-babd-6232e12aafe0\") " pod="default/busybox-7dff88458-5kkbw"
	Aug 31 22:32:44 ha-949000 kubelet[2157]: E0831 22:32:44.489812    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:32:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:32:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:33:44 ha-949000 kubelet[2157]: E0831 22:33:44.492393    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:33:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:33:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:34:44 ha-949000 kubelet[2157]: E0831 22:34:44.489301    2157 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:34:44 ha-949000 kubelet[2157]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:34:44 ha-949000 kubelet[2157]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:34:44 ha-949000 kubelet[2157]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:34:44 ha-949000 kubelet[2157]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-949000 -n ha-949000
helpers_test.go:262: (dbg) Run:  kubectl --context ha-949000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:286: <<< TestMultiControlPlane/serial/RestartSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:287: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/RestartSecondaryNode (93.43s)

                                                
                                    
TestMultiControlPlane/serial/RestartClusterKeepsNodes (408.58s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/RestartClusterKeepsNodes
ha_test.go:456: (dbg) Run:  out/minikube-darwin-amd64 node list -p ha-949000 -v=7 --alsologtostderr
ha_test.go:462: (dbg) Run:  out/minikube-darwin-amd64 stop -p ha-949000 -v=7 --alsologtostderr
E0831 15:35:36.588194    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
ha_test.go:462: (dbg) Done: out/minikube-darwin-amd64 stop -p ha-949000 -v=7 --alsologtostderr: (33.142199096s)
ha_test.go:467: (dbg) Run:  out/minikube-darwin-amd64 start -p ha-949000 --wait=true -v=7 --alsologtostderr
E0831 15:37:52.720842    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:38:20.432044    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:39:15.443840    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:40:38.523429    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
ha_test.go:467: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p ha-949000 --wait=true -v=7 --alsologtostderr: exit status 80 (6m10.231513314s)

                                                
                                                
-- stdout --
	* [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	* Restarting existing hyperkit VM for "ha-949000" ...
	* Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	* Enabled addons: 
	
	* Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	* Restarting existing hyperkit VM for "ha-949000-m02" ...
	* Found network options:
	  - NO_PROXY=192.169.0.5
	* Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	  - env NO_PROXY=192.169.0.5
	* Verifying Kubernetes components...
	
	* Starting "ha-949000-m03" control-plane node in "ha-949000" cluster
	* Restarting existing hyperkit VM for "ha-949000-m03" ...
	* Found network options:
	  - NO_PROXY=192.169.0.5,192.169.0.6
	* Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	  - env NO_PROXY=192.169.0.5
	  - env NO_PROXY=192.169.0.5,192.169.0.6
	* Verifying Kubernetes components...
	
	* Starting "ha-949000-m04" worker node in "ha-949000" cluster
	* Restarting existing hyperkit VM for "ha-949000-m04" ...
	* Found network options:
	  - NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	* Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	  - env NO_PROXY=192.169.0.5
	  - env NO_PROXY=192.169.0.5,192.169.0.6
	  - env NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	* Verifying Kubernetes components...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:36:09.764310    3744 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:36:09.764592    3744 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:36:09.764597    3744 out.go:358] Setting ErrFile to fd 2...
	I0831 15:36:09.764601    3744 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:36:09.764770    3744 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:36:09.766289    3744 out.go:352] Setting JSON to false
	I0831 15:36:09.790255    3744 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2140,"bootTime":1725141629,"procs":434,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:36:09.790362    3744 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:36:09.812967    3744 out.go:177] * [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:36:09.857017    3744 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:36:09.857063    3744 notify.go:220] Checking for updates...
	I0831 15:36:09.900714    3744 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:09.921979    3744 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:36:09.948841    3744 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:36:09.970509    3744 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:36:09.991512    3744 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:36:10.013794    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:10.013954    3744 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:36:10.014628    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:10.014709    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:10.024181    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51800
	I0831 15:36:10.024557    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:10.024973    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:10.024981    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:10.025208    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:10.025338    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:10.053425    3744 out.go:177] * Using the hyperkit driver based on existing profile
	I0831 15:36:10.095518    3744 start.go:297] selected driver: hyperkit
	I0831 15:36:10.095547    3744 start.go:901] validating driver "hyperkit" against &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:fals
e efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p20
00.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:36:10.095803    3744 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:36:10.095991    3744 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:36:10.096192    3744 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:36:10.105897    3744 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:36:10.111634    3744 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:10.111657    3744 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:36:10.114891    3744 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:36:10.114962    3744 cni.go:84] Creating CNI manager for ""
	I0831 15:36:10.114970    3744 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0831 15:36:10.115051    3744 start.go:340] cluster config:
	{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.16
9.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-t
iller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0
MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:36:10.115155    3744 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:36:10.157575    3744 out.go:177] * Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	I0831 15:36:10.178565    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:36:10.178634    3744 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:36:10.178661    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:36:10.178859    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:36:10.178882    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:36:10.179080    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:10.179968    3744 start.go:360] acquireMachinesLock for ha-949000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:36:10.180093    3744 start.go:364] duration metric: took 100.253µs to acquireMachinesLock for "ha-949000"
	I0831 15:36:10.180125    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:36:10.180144    3744 fix.go:54] fixHost starting: 
	I0831 15:36:10.180570    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:10.180626    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:10.189873    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51802
	I0831 15:36:10.190215    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:10.190587    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:10.190602    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:10.190832    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:10.190956    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:10.191047    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:36:10.191129    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:10.191205    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:36:10.192132    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 2887 missing from process table
	I0831 15:36:10.192166    3744 fix.go:112] recreateIfNeeded on ha-949000: state=Stopped err=<nil>
	I0831 15:36:10.192185    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	W0831 15:36:10.192270    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:36:10.235417    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000" ...
	I0831 15:36:10.258400    3744 main.go:141] libmachine: (ha-949000) Calling .Start
	I0831 15:36:10.258670    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:10.258717    3744 main.go:141] libmachine: (ha-949000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid
	I0831 15:36:10.260851    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 2887 missing from process table
	I0831 15:36:10.260866    3744 main.go:141] libmachine: (ha-949000) DBG | pid 2887 is in state "Stopped"
	I0831 15:36:10.260894    3744 main.go:141] libmachine: (ha-949000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid...
	I0831 15:36:10.261058    3744 main.go:141] libmachine: (ha-949000) DBG | Using UUID 98cab9ba-901d-49d1-9e6c-321a4533d56e
	I0831 15:36:10.370955    3744 main.go:141] libmachine: (ha-949000) DBG | Generated MAC ce:8:77:f7:42:5e
	I0831 15:36:10.370980    3744 main.go:141] libmachine: (ha-949000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:36:10.371093    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003a6900)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:10.371127    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003a6900)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:10.371175    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "98cab9ba-901d-49d1-9e6c-321a4533d56e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial l
oglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:36:10.371220    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 98cab9ba-901d-49d1-9e6c-321a4533d56e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset noresto
re waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:36:10.371232    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:36:10.372813    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Pid is 3756
	I0831 15:36:10.373286    3744 main.go:141] libmachine: (ha-949000) DBG | Attempt 0
	I0831 15:36:10.373298    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:10.373398    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:36:10.375146    3744 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:36:10.375210    3744 main.go:141] libmachine: (ha-949000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:36:10.375229    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ebe4}
	I0831 15:36:10.375249    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d4eb85}
	I0831 15:36:10.375272    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:36:10.375287    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:36:10.375330    3744 main.go:141] libmachine: (ha-949000) DBG | Found match: ce:8:77:f7:42:5e
	I0831 15:36:10.375341    3744 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:36:10.375350    3744 main.go:141] libmachine: (ha-949000) DBG | IP: 192.169.0.5
	I0831 15:36:10.376038    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:10.376245    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:10.376722    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:36:10.376735    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:10.376898    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:10.377023    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:10.377121    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:10.377226    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:10.377318    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:10.377457    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:10.377688    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:10.377699    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:36:10.380749    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:36:10.432938    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:36:10.433650    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:10.433669    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:10.433677    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:10.433685    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:10.813736    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:36:10.813750    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:36:10.928786    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:10.928808    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:10.928820    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:10.928840    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:10.929718    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:36:10.929729    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:36:16.483580    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:36:16.483594    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:36:16.483602    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:36:16.508100    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:36:21.446393    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:36:21.446406    3744 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:36:21.446553    3744 buildroot.go:166] provisioning hostname "ha-949000"
	I0831 15:36:21.446562    3744 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:36:21.446665    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.446786    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.446905    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.447025    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.447124    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.447308    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.447472    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.447480    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000 && echo "ha-949000" | sudo tee /etc/hostname
	I0831 15:36:21.524007    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000
	
	I0831 15:36:21.524025    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.524158    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.524268    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.524375    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.524479    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.524631    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.524781    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.524792    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:36:21.591782    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:36:21.591802    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:36:21.591822    3744 buildroot.go:174] setting up certificates
	I0831 15:36:21.591828    3744 provision.go:84] configureAuth start
	I0831 15:36:21.591834    3744 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:36:21.591970    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:21.592077    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.592183    3744 provision.go:143] copyHostCerts
	I0831 15:36:21.592217    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:21.592287    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:36:21.592295    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:21.592443    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:36:21.592667    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:21.592706    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:36:21.592710    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:21.592784    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:36:21.592937    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:21.592978    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:36:21.592983    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:21.593095    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:36:21.593248    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000 san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]
	I0831 15:36:21.710940    3744 provision.go:177] copyRemoteCerts
	I0831 15:36:21.710993    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:36:21.711008    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.711135    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.711246    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.711328    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.711434    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:21.747436    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:36:21.747514    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:36:21.767330    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:36:21.767390    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0831 15:36:21.787147    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:36:21.787210    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:36:21.806851    3744 provision.go:87] duration metric: took 215.008206ms to configureAuth
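The configureAuth step above generates a server certificate whose SAN list covers the VM's IPs and hostnames (`san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]`) and signs it with the profile CA. A minimal sketch of an equivalent SAN-bearing cert with plain `openssl`, using throwaway paths and an illustrative subject (minikube does this in Go, not via the CLI):

```shell
# Sketch only: self-signed CA plus a server cert carrying the same kind of
# SAN list seen in the log. All paths are scratch files, not minikube's.
set -e
dir=$(mktemp -d)
printf 'subjectAltName=IP:127.0.0.1,IP:192.169.0.5,DNS:localhost,DNS:minikube\n' > "$dir/ext.cnf"
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout "$dir/ca-key.pem" -out "$dir/ca.pem" -subj "/O=jenkins.ha-949000"
openssl req -newkey rsa:2048 -nodes \
  -keyout "$dir/server-key.pem" -out "$dir/server.csr" -subj "/CN=minikube"
openssl x509 -req -in "$dir/server.csr" -CA "$dir/ca.pem" -CAkey "$dir/ca-key.pem" \
  -CAcreateserial -days 1 -out "$dir/server.pem" -extfile "$dir/ext.cnf"
# Print the cert; the SAN extension should list the IPs and DNS names above.
santext=$(openssl x509 -in "$dir/server.pem" -noout -text)
echo "$santext" | grep 'Alt'
```

The resulting `server.pem`/`server-key.pem` pair corresponds to what the log then copies to `/etc/docker/` for dockerd's `--tlsverify` mode.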
	I0831 15:36:21.806864    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:36:21.807028    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:21.807041    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:21.807176    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.807304    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.807387    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.807476    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.807574    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.807684    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.807812    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.807819    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:36:21.869123    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:36:21.869137    3744 buildroot.go:70] root file system type: tmpfs
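The rootfs probe that produced the `tmpfs` answer is the one-liner shown in the SSH command above: `df --output=fstype` prints a header plus the type, and `tail -n 1` keeps only the value. Run locally (GNU coreutils assumed) it looks like:

```shell
# Same probe the log runs over SSH: report the filesystem type of /.
# `df --output=fstype` is GNU-specific; the header line is dropped by tail.
fstype=$(df --output=fstype / | tail -n 1)
echo "root file system type: $fstype"
```

On the Buildroot guest this returns `tmpfs`, which is why minikube writes the docker unit fresh rather than assuming a persistent one exists.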
	I0831 15:36:21.869215    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:36:21.869228    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.869368    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.869456    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.869553    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.869651    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.869776    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.869915    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.869959    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:36:21.941116    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:36:21.941136    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.941270    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.941365    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.941441    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.941529    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.941663    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.941807    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.941819    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:36:23.639328    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:36:23.639343    3744 machine.go:96] duration metric: took 13.26247014s to provisionDockerMachine
	I0831 15:36:23.639354    3744 start.go:293] postStartSetup for "ha-949000" (driver="hyperkit")
	I0831 15:36:23.639362    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:36:23.639372    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.639572    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:36:23.639587    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.639684    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.639792    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.639927    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.640026    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.679356    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:36:23.683676    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:36:23.683690    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:36:23.683793    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:36:23.683980    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:36:23.683987    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:36:23.684187    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:36:23.697074    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:23.724665    3744 start.go:296] duration metric: took 85.300709ms for postStartSetup
	I0831 15:36:23.724694    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.724869    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:36:23.724883    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.724980    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.725089    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.725189    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.725280    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.763464    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:36:23.763527    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:36:23.797396    3744 fix.go:56] duration metric: took 13.617113477s for fixHost
	I0831 15:36:23.797420    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.797554    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.797655    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.797749    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.797839    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.797970    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:23.798114    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:23.798122    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:36:23.858158    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143783.919023246
	
	I0831 15:36:23.858170    3744 fix.go:216] guest clock: 1725143783.919023246
	I0831 15:36:23.858175    3744 fix.go:229] Guest: 2024-08-31 15:36:23.919023246 -0700 PDT Remote: 2024-08-31 15:36:23.79741 -0700 PDT m=+14.070978631 (delta=121.613246ms)
	I0831 15:36:23.858196    3744 fix.go:200] guest clock delta is within tolerance: 121.613246ms
	I0831 15:36:23.858200    3744 start.go:83] releasing machines lock for "ha-949000", held for 13.677948956s
	I0831 15:36:23.858225    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858359    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:23.858452    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858730    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858831    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858919    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:36:23.858951    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.858972    3744 ssh_runner.go:195] Run: cat /version.json
	I0831 15:36:23.858983    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.859063    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.859085    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.859194    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.859214    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.859295    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.859309    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.859385    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.859397    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.890757    3744 ssh_runner.go:195] Run: systemctl --version
	I0831 15:36:23.938659    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:36:23.943864    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:36:23.943901    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:36:23.956026    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:36:23.956039    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:23.956147    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:23.971422    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:36:23.980435    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:36:23.989142    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:36:23.989181    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:36:23.997930    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:24.006635    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:36:24.015080    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:24.023671    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:36:24.032589    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:36:24.041364    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:36:24.050087    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:36:24.058866    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:36:24.066704    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:36:24.074622    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:24.168184    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
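The run of `sed -i -r` commands above rewrites `/etc/containerd/config.toml` in place, most importantly forcing `SystemdCgroup = false` so containerd matches the `cgroupfs` driver minikube selected. The same whitespace-preserving rewrite on a scratch copy of the file:

```shell
# Sketch of the cgroup-driver rewrite: flip SystemdCgroup on a scratch
# config.toml, preserving the original indentation via the \1 capture.
cfg=$(mktemp)
printf '[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]\n  SystemdCgroup = true\n' > "$cfg"
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
cat "$cfg"
```

The `( *)` capture is what lets one expression handle the key at any nesting depth in the TOML file, which is why the log applies the same pattern regardless of containerd version.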
	I0831 15:36:24.187633    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:24.187713    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:36:24.206675    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:24.220212    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:36:24.240424    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:24.250685    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:24.261052    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:36:24.286854    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:24.297197    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:24.312454    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:36:24.315602    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:36:24.323102    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:36:24.337130    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:36:24.434813    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:36:24.537809    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:36:24.537887    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:36:24.552112    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:24.656146    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:36:26.992775    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.336585914s)
	I0831 15:36:26.992844    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:36:27.003992    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:36:27.018708    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:27.029918    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:36:27.137311    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:36:27.239047    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:27.342173    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:36:27.356192    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:27.367097    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:27.470187    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:36:27.536105    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:36:27.536192    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:36:27.540763    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:36:27.540810    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:36:27.544037    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:36:27.570291    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:36:27.570367    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:27.588378    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:27.648285    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:36:27.648336    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:27.648820    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:36:27.653344    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:36:27.662997    3744 kubeadm.go:883] updating cluster {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.
0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:36:27.663083    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:36:27.663134    3744 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:36:27.676654    3744 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:36:27.676670    3744 docker.go:615] Images already preloaded, skipping extraction
	I0831 15:36:27.676747    3744 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:36:27.690446    3744 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:36:27.690466    3744 cache_images.go:84] Images are preloaded, skipping loading
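The "Images are preloaded, skipping loading" decision comes from comparing the `docker images --format {{.Repository}}:{{.Tag}}` output against the image set the target Kubernetes version needs. A self-contained sketch of that membership check, with the "got" list canned from the log output above rather than read from a live docker daemon:

```shell
# Preload-check sketch: every required repo:tag must appear verbatim in the
# daemon's image list. `got` is canned here; no docker daemon is assumed.
got='registry.k8s.io/kube-apiserver:v1.31.0
registry.k8s.io/etcd:3.5.15-0
registry.k8s.io/pause:3.10'
missing=0
for want in registry.k8s.io/kube-apiserver:v1.31.0 registry.k8s.io/etcd:3.5.15-0; do
  echo "$got" | grep -qx "$want" || { echo "missing: $want"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "Images already preloaded, skipping extraction"
```

Only when a required image is absent does minikube fall back to extracting the preload tarball, which is why the second identical `docker images` run in the log is cheap.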
	I0831 15:36:27.690484    3744 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.31.0 docker true true} ...
	I0831 15:36:27.690565    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:36:27.690634    3744 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:36:27.729077    3744 cni.go:84] Creating CNI manager for ""
	I0831 15:36:27.729090    3744 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0831 15:36:27.729101    3744 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:36:27.729122    3744 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-949000 NodeName:ha-949000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:36:27.729202    3744 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-949000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
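The kubeadm config logged above is a four-document YAML stream separated by `---`: an InitConfiguration, a ClusterConfiguration, a KubeletConfiguration, and a KubeProxyConfiguration. A minimal stdlib-only Python sketch (a real tool would use PyYAML's `yaml.safe_load_all`; the splitter and sample below are illustrative, not minikube code) shows how the documents break apart:

```python
# Hypothetical sketch: split a kubeadm multi-document YAML stream (like the
# one logged above) and report the `kind:` of each document. Stdlib only;
# a production parser would use yaml.safe_load_all instead of string splits.
def document_kinds(stream: str) -> list[str]:
    kinds = []
    for doc in stream.split("\n---\n"):
        for line in doc.splitlines():
            if line.startswith("kind:"):
                kinds.append(line.split(":", 1)[1].strip())
                break
    return kinds

# Trimmed-down sample mirroring the structure of the config above.
sample = """apiVersion: kubeadm.k8s.io/v1beta3
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta3
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
"""
print(document_kinds(sample))
```

Running this prints the four kinds in order, matching the structure of the generated config.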
	I0831 15:36:27.729215    3744 kube-vip.go:115] generating kube-vip config ...
	I0831 15:36:27.729267    3744 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:36:27.741901    3744 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:36:27.741972    3744 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
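The kube-vip manifest above enables leader election with `vip_leaseduration=5`, `vip_renewdeadline=3`, and `vip_retryperiod=1` (seconds). These follow the usual Kubernetes leader-election ordering: the retry period must be shorter than the renew deadline, which must be shorter than the lease duration. A hypothetical sanity check (not minikube or kube-vip code) of that ordering:

```python
# Hypothetical sanity check of the leader-election timings in the kube-vip
# manifest above. Mirrors the usual client-go style constraint:
#   retryPeriod < renewDeadline < leaseDuration
def valid_leader_election(lease: int, renew: int, retry: int) -> bool:
    return retry < renew < lease

# Settings from the manifest: leaseduration=5, renewdeadline=3, retryperiod=1
print(valid_leader_election(5, 3, 1))  # → True
```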
	I0831 15:36:27.742025    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:36:27.751754    3744 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:36:27.751799    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0831 15:36:27.759784    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0831 15:36:27.773166    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:36:27.786640    3744 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0831 15:36:27.800639    3744 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:36:27.814083    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:36:27.817014    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:36:27.827332    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:27.924726    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:36:27.939552    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.5
	I0831 15:36:27.939571    3744 certs.go:194] generating shared ca certs ...
	I0831 15:36:27.939581    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:27.939767    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:36:27.939836    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:36:27.939848    3744 certs.go:256] generating profile certs ...
	I0831 15:36:27.939960    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:36:27.939980    3744 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7
	I0831 15:36:27.939996    3744 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0831 15:36:27.990143    3744 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7 ...
	I0831 15:36:27.990157    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7: {Name:mkcaa83b4b223ea37e242b23bc80c554e3269eac Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:27.990861    3744 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7 ...
	I0831 15:36:27.990872    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7: {Name:mk789cab6bc4fccb81a6d827e090943e3a032cb6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:27.991117    3744 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:36:27.991353    3744 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:36:27.991605    3744 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:36:27.991615    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:36:27.991642    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:36:27.991663    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:36:27.991688    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:36:27.991706    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:36:27.991724    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:36:27.991744    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:36:27.991761    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:36:27.991852    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:36:27.991900    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:36:27.991909    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:36:27.991937    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:36:27.991968    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:36:27.992001    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:36:27.992071    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:27.992107    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:36:27.992134    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:27.992153    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:36:27.992665    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:36:28.012619    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:36:28.037918    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:36:28.059676    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:36:28.085374    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0831 15:36:28.108665    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:36:28.134880    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:36:28.163351    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:36:28.189443    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:36:28.237208    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:36:28.275840    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:36:28.307738    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:36:28.327147    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:36:28.332485    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:36:28.341869    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:28.345319    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:28.345361    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:28.356453    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:36:28.366034    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:36:28.375170    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:36:28.378621    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:36:28.378656    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:36:28.382855    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:36:28.392032    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:36:28.401330    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:36:28.404932    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:36:28.404981    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:36:28.409135    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:36:28.418467    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:36:28.421857    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:36:28.426311    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:36:28.430575    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:36:28.435252    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:36:28.439597    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:36:28.443958    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
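The repeated `openssl x509 -noout -in … -checkend 86400` runs above each ask whether a certificate will have expired 86400 seconds (24 hours) from now; a zero exit status means the cert is still valid past that window, so no regeneration is needed. A hedged Python equivalent of that check (illustrative only, not how minikube implements it):

```python
# Hypothetical Python analogue of `openssl x509 -checkend 86400`:
# report whether a certificate's notAfter falls within the next N seconds.
from datetime import datetime, timedelta, timezone

def expires_within(not_after: datetime, seconds: int = 86400) -> bool:
    """True if the certificate expires inside the next `seconds` seconds."""
    return datetime.now(timezone.utc) + timedelta(seconds=seconds) >= not_after

# A cert valid for another year is fine; one expiring in an hour is not.
year_out = datetime.now(timezone.utc) + timedelta(days=365)
print(expires_within(year_out))  # → False (no renewal needed)
```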
	I0831 15:36:28.448329    3744 kubeadm.go:392] StartCluster: {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 C
lusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false fres
hpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:do
cker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:36:28.448445    3744 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:36:28.461457    3744 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:36:28.469983    3744 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0831 15:36:28.469994    3744 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0831 15:36:28.470033    3744 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0831 15:36:28.478435    3744 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:36:28.478738    3744 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-949000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:28.478830    3744 kubeconfig.go:62] /Users/jenkins/minikube-integration/18943-957/kubeconfig needs updating (will repair): [kubeconfig missing "ha-949000" cluster setting kubeconfig missing "ha-949000" context setting]
	I0831 15:36:28.479071    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:28.479445    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:28.479626    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, Use
rAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 15:36:28.479933    3744 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 15:36:28.480130    3744 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0831 15:36:28.488296    3744 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0831 15:36:28.488308    3744 kubeadm.go:597] duration metric: took 18.310201ms to restartPrimaryControlPlane
	I0831 15:36:28.488312    3744 kubeadm.go:394] duration metric: took 39.987749ms to StartCluster
	I0831 15:36:28.488320    3744 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:28.488392    3744 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:28.488767    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:28.488978    3744 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:36:28.488992    3744 start.go:241] waiting for startup goroutines ...
	I0831 15:36:28.489001    3744 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 15:36:28.489144    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:28.531040    3744 out.go:177] * Enabled addons: 
	I0831 15:36:28.551931    3744 addons.go:510] duration metric: took 62.927579ms for enable addons: enabled=[]
	I0831 15:36:28.552016    3744 start.go:246] waiting for cluster config update ...
	I0831 15:36:28.552028    3744 start.go:255] writing updated cluster config ...
	I0831 15:36:28.574130    3744 out.go:201] 
	I0831 15:36:28.595598    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:28.595734    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:28.618331    3744 out.go:177] * Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	I0831 15:36:28.659956    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:36:28.659989    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:36:28.660178    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:36:28.660194    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:36:28.660319    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:28.661341    3744 start.go:360] acquireMachinesLock for ha-949000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:36:28.661436    3744 start.go:364] duration metric: took 71.648µs to acquireMachinesLock for "ha-949000-m02"
	I0831 15:36:28.661461    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:36:28.661470    3744 fix.go:54] fixHost starting: m02
	I0831 15:36:28.661902    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:28.661926    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:28.670964    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51824
	I0831 15:36:28.671287    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:28.671608    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:28.671619    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:28.671857    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:28.671991    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:28.672109    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:36:28.672201    3744 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:28.672291    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:36:28.673213    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3528 missing from process table
	I0831 15:36:28.673240    3744 fix.go:112] recreateIfNeeded on ha-949000-m02: state=Stopped err=<nil>
	I0831 15:36:28.673248    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	W0831 15:36:28.673335    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:36:28.714811    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m02" ...
	I0831 15:36:28.736047    3744 main.go:141] libmachine: (ha-949000-m02) Calling .Start
	I0831 15:36:28.736403    3744 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:28.736434    3744 main.go:141] libmachine: (ha-949000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid
	I0831 15:36:28.738213    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3528 missing from process table
	I0831 15:36:28.738226    3744 main.go:141] libmachine: (ha-949000-m02) DBG | pid 3528 is in state "Stopped"
	I0831 15:36:28.738249    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid...
	I0831 15:36:28.738619    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Using UUID 23e5d675-5201-4f3d-86b7-b25c818528d1
	I0831 15:36:28.765315    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Generated MAC 92:7:3c:3f:ee:b7
	I0831 15:36:28.765332    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:36:28.765455    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c0a20)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:28.765495    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c0a20)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:28.765521    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "23e5d675-5201-4f3d-86b7-b25c818528d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-94
9000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:36:28.765553    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 23e5d675-5201-4f3d-86b7-b25c818528d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 co
nsole=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:36:28.765562    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:36:28.767165    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Pid is 3763
	I0831 15:36:28.767495    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 0
	I0831 15:36:28.767509    3744 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:28.767583    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3763
	I0831 15:36:28.769355    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:36:28.769415    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:36:28.769450    3744 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:36:28.769473    3744 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ebe4}
	I0831 15:36:28.769487    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Found match: 92:7:3c:3f:ee:b7
	I0831 15:36:28.769498    3744 main.go:141] libmachine: (ha-949000-m02) DBG | IP: 192.169.0.6
	I0831 15:36:28.769505    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:36:28.770167    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:28.770374    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:28.770722    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:36:28.770732    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:28.770845    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:28.770937    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:28.771045    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:28.771147    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:28.771273    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:28.771413    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:28.771572    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:28.771580    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:36:28.775197    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:36:28.783845    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:36:28.784655    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:28.784674    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:28.784685    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:28.784693    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:29.168717    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:36:29.168732    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:36:29.283641    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:29.283661    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:29.283712    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:29.283753    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:29.284560    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:36:29.284571    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:36:34.866750    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 15:36:34.866767    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 15:36:34.866778    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 15:36:34.891499    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 15:36:39.840129    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:36:39.840143    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:36:39.840307    3744 buildroot.go:166] provisioning hostname "ha-949000-m02"
	I0831 15:36:39.840319    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:36:39.840413    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:39.840489    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:39.840578    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.840665    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.840764    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:39.840907    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:39.841055    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:39.841064    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m02 && echo "ha-949000-m02" | sudo tee /etc/hostname
	I0831 15:36:39.913083    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m02
	
	I0831 15:36:39.913098    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:39.913252    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:39.913377    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.913471    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.913560    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:39.913685    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:39.913826    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:39.913837    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:36:39.987034    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:36:39.987048    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:36:39.987056    3744 buildroot.go:174] setting up certificates
	I0831 15:36:39.987062    3744 provision.go:84] configureAuth start
	I0831 15:36:39.987067    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:36:39.987204    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:39.987310    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:39.987418    3744 provision.go:143] copyHostCerts
	I0831 15:36:39.987447    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:39.987493    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:36:39.987499    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:39.988044    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:36:39.988241    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:39.988272    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:36:39.988277    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:39.988347    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:36:39.988492    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:39.988529    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:36:39.988533    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:39.988597    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:36:39.988746    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m02 san=[127.0.0.1 192.169.0.6 ha-949000-m02 localhost minikube]
	I0831 15:36:40.055665    3744 provision.go:177] copyRemoteCerts
	I0831 15:36:40.055717    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:36:40.055733    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.055998    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.056098    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.056185    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.056277    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:40.095370    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:36:40.095446    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:36:40.115272    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:36:40.115336    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:36:40.134845    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:36:40.134920    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:36:40.154450    3744 provision.go:87] duration metric: took 167.380587ms to configureAuth
	I0831 15:36:40.154464    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:36:40.154620    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:40.154633    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:40.154762    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.154852    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.154930    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.155003    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.155112    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.155216    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:40.155334    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:40.155341    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:36:40.220781    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:36:40.220794    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:36:40.220873    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:36:40.220884    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.221013    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.221103    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.221194    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.221272    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.221400    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:40.221546    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:40.221589    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:36:40.298646    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:36:40.298663    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.298789    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.298885    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.298979    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.299063    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.299201    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:40.299341    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:40.299353    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:36:41.956479    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:36:41.956495    3744 machine.go:96] duration metric: took 13.1856235s to provisionDockerMachine
	I0831 15:36:41.956502    3744 start.go:293] postStartSetup for "ha-949000-m02" (driver="hyperkit")
	I0831 15:36:41.956508    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:36:41.956522    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:41.956703    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:36:41.956716    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:41.956812    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:41.956896    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:41.956992    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:41.957077    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:42.000050    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:36:42.004306    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:36:42.004318    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:36:42.004439    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:36:42.004572    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:36:42.004578    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:36:42.004735    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:36:42.017617    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:42.041071    3744 start.go:296] duration metric: took 84.560659ms for postStartSetup
	I0831 15:36:42.041107    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.041300    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:36:42.041313    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:42.041398    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.041504    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.041609    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.041700    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:42.081048    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:36:42.081113    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:36:42.134445    3744 fix.go:56] duration metric: took 13.472828598s for fixHost
	I0831 15:36:42.134470    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:42.134618    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.134730    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.134822    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.134900    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.135030    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:42.135170    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:42.135178    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:36:42.199131    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143802.088359974
	
	I0831 15:36:42.199142    3744 fix.go:216] guest clock: 1725143802.088359974
	I0831 15:36:42.199147    3744 fix.go:229] Guest: 2024-08-31 15:36:42.088359974 -0700 PDT Remote: 2024-08-31 15:36:42.13446 -0700 PDT m=+32.407831620 (delta=-46.100026ms)
	I0831 15:36:42.199164    3744 fix.go:200] guest clock delta is within tolerance: -46.100026ms
	I0831 15:36:42.199169    3744 start.go:83] releasing machines lock for "ha-949000-m02", held for 13.537577271s
	I0831 15:36:42.199184    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.199330    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:42.220967    3744 out.go:177] * Found network options:
	I0831 15:36:42.242795    3744 out.go:177]   - NO_PROXY=192.169.0.5
	W0831 15:36:42.265056    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:36:42.265093    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.265983    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.266241    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.266370    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:36:42.266410    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	W0831 15:36:42.266454    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:36:42.266575    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:36:42.266625    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:42.266633    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.266836    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.266871    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.267025    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.267062    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.267162    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.267189    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:42.267302    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	W0831 15:36:42.303842    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:36:42.303902    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:36:42.349152    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:36:42.349174    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:42.349280    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:42.365129    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:36:42.373393    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:36:42.381789    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:36:42.381831    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:36:42.389963    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:42.398325    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:36:42.406574    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:42.414917    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:36:42.423513    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:36:42.431936    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:36:42.440352    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:36:42.449208    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:36:42.457008    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:36:42.464909    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:42.567905    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:36:42.588297    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:42.588366    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:36:42.602440    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:42.618217    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:36:42.633678    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:42.645147    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:42.656120    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:36:42.679235    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:42.690584    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:42.706263    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:36:42.709220    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:36:42.717254    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:36:42.730693    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:36:42.826051    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:36:42.930594    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:36:42.930623    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:36:42.944719    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:43.038034    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:36:45.352340    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.314261795s)
	I0831 15:36:45.352402    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:36:45.362569    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:36:45.374992    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:45.385146    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:36:45.481701    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:36:45.590417    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:45.703387    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:36:45.717135    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:45.728130    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:45.822749    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:36:45.893539    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:36:45.893614    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:36:45.898396    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:36:45.898450    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:36:45.901472    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:36:45.929873    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:36:45.929947    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:45.947410    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:45.987982    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:36:46.029879    3744 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:36:46.051790    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:46.052207    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:36:46.056767    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:36:46.066419    3744 mustload.go:65] Loading cluster: ha-949000
	I0831 15:36:46.066592    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:46.066799    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:46.066820    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:46.075457    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51846
	I0831 15:36:46.075806    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:46.076162    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:46.076180    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:46.076408    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:46.076531    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:36:46.076614    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:46.076682    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:36:46.077630    3744 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:36:46.077872    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:46.077895    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:46.086285    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51848
	I0831 15:36:46.086630    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:46.086945    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:46.086955    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:46.087205    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:46.087313    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:46.087418    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.6
	I0831 15:36:46.087426    3744 certs.go:194] generating shared ca certs ...
	I0831 15:36:46.087439    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:46.087575    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:36:46.087627    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:36:46.087636    3744 certs.go:256] generating profile certs ...
	I0831 15:36:46.087739    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:36:46.087826    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.e26aa346
	I0831 15:36:46.087882    3744 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:36:46.087890    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:36:46.087915    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:36:46.087944    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:36:46.087962    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:36:46.087979    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:36:46.087997    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:36:46.088015    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:36:46.088032    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:36:46.088113    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:36:46.088150    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:36:46.088158    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:36:46.088191    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:36:46.088226    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:36:46.088254    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:36:46.088317    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:46.088349    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.088368    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.088390    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.088420    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:46.088505    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:46.088596    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:46.088688    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:46.088763    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:46.117725    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:36:46.121346    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:36:46.129782    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:36:46.133012    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:36:46.141510    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:36:46.144605    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:36:46.152913    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:36:46.156010    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:36:46.165156    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:36:46.168250    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:36:46.176838    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:36:46.179929    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:36:46.189075    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:36:46.209492    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:36:46.229359    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:36:46.249285    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:36:46.268964    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0831 15:36:46.288566    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:36:46.308035    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:36:46.327968    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:36:46.347874    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:36:46.367538    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:36:46.387135    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:36:46.406841    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:36:46.420747    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:36:46.434267    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:36:46.447929    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:36:46.461487    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:36:46.475040    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:36:46.488728    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:36:46.502198    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:36:46.506532    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:36:46.514857    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.518202    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.518240    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.522435    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:36:46.530730    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:36:46.538900    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.542200    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.542233    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.546382    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:36:46.554646    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:36:46.562775    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.566092    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.566127    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.570335    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:36:46.578778    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:36:46.582068    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:36:46.586501    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:36:46.590751    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:36:46.594979    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:36:46.599120    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:36:46.603290    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 15:36:46.607503    3744 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.31.0 docker true true} ...
	I0831 15:36:46.607561    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:36:46.607581    3744 kube-vip.go:115] generating kube-vip config ...
	I0831 15:36:46.607619    3744 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:36:46.620005    3744 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:36:46.620042    3744 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:36:46.620097    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:36:46.627507    3744 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:36:46.627555    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:36:46.634842    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:36:46.648529    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:36:46.661781    3744 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:36:46.675402    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:36:46.678250    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:36:46.687467    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:46.779379    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:36:46.793112    3744 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:36:46.793294    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:46.814624    3744 out.go:177] * Verifying Kubernetes components...
	I0831 15:36:46.835323    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:46.948649    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:36:46.960452    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:46.960657    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:36:46.960690    3744 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:36:46.960842    3744 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:36:46.960927    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:46.960932    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:46.960940    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:46.960943    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.801259    3744 round_trippers.go:574] Response Status: 200 OK in 8840 milliseconds
	I0831 15:36:55.802034    3744 node_ready.go:49] node "ha-949000-m02" has status "Ready":"True"
	I0831 15:36:55.802046    3744 node_ready.go:38] duration metric: took 8.841092254s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:36:55.802051    3744 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:36:55.802085    3744 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 15:36:55.802094    3744 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 15:36:55.802131    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:36:55.802136    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.802142    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.802147    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.817181    3744 round_trippers.go:574] Response Status: 200 OK in 15 milliseconds
	I0831 15:36:55.823106    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.823166    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:36:55.823172    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.823178    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.823182    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.833336    3744 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0831 15:36:55.833806    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:55.833817    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.833824    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.833829    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.843262    3744 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0831 15:36:55.843562    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.843572    3744 pod_ready.go:82] duration metric: took 20.449445ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.843595    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.843648    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:36:55.843655    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.843662    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.843667    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.846571    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:55.846969    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:55.846976    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.846982    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.846985    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.848597    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.848912    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.848921    3744 pod_ready.go:82] duration metric: took 5.319208ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.848934    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.848970    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:36:55.848975    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.848981    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.848985    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.850738    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.851195    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:55.851203    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.851209    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.851212    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.852625    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.853038    3744 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.853047    3744 pod_ready.go:82] duration metric: took 4.107015ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.853053    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.853087    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:36:55.853092    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.853100    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.853104    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.854440    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.854845    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:55.854852    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.854858    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.854861    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.856182    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.856534    3744 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.856542    3744 pod_ready.go:82] duration metric: took 3.483952ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.856548    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.856578    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:36:55.856582    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.856588    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.856592    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.858303    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:56.003107    3744 request.go:632] Waited for 144.429757ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:56.003176    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:56.003183    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.003189    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.003193    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.004813    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:56.005140    3744 pod_ready.go:93] pod "etcd-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:56.005149    3744 pod_ready.go:82] duration metric: took 148.59533ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.005160    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.202344    3744 request.go:632] Waited for 197.12667ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:36:56.202386    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:36:56.202417    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.202425    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.202428    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.205950    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:56.403821    3744 request.go:632] Waited for 197.364477ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:56.403986    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:56.403997    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.404008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.404017    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.407269    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:56.407644    3744 pod_ready.go:98] node "ha-949000" hosting pod "kube-apiserver-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:56.407658    3744 pod_ready.go:82] duration metric: took 402.487822ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	E0831 15:36:56.407673    3744 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000" hosting pod "kube-apiserver-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:56.407681    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.602890    3744 request.go:632] Waited for 195.157951ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:36:56.602980    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:36:56.602991    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.603003    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.603010    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.606100    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:56.802222    3744 request.go:632] Waited for 195.71026ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:56.802289    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:56.802295    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.802301    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.802305    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.804612    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:56.804914    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:56.804923    3744 pod_ready.go:82] duration metric: took 397.232028ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.804930    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.003522    3744 request.go:632] Waited for 198.554376ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:36:57.003559    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:36:57.003600    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.003608    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.003618    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.005675    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.203456    3744 request.go:632] Waited for 197.402218ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:57.203520    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:57.203526    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.203532    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.203537    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.206124    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.206516    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:57.206526    3744 pod_ready.go:82] duration metric: took 401.586021ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.206534    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.402973    3744 request.go:632] Waited for 196.400032ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:36:57.403011    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:36:57.403017    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.403051    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.403056    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.405260    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.603636    3744 request.go:632] Waited for 197.987151ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:57.603708    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:57.603713    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.603719    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.603724    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.606022    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.606364    3744 pod_ready.go:98] node "ha-949000" hosting pod "kube-controller-manager-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:57.606376    3744 pod_ready.go:82] duration metric: took 399.83214ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	E0831 15:36:57.606383    3744 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000" hosting pod "kube-controller-manager-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:57.606388    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.802885    3744 request.go:632] Waited for 196.449707ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:57.803017    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:57.803028    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.803039    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.803046    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.806339    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.003449    3744 request.go:632] Waited for 196.421818ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.003513    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.003518    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.003524    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.003527    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.005621    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:58.203691    3744 request.go:632] Waited for 95.498322ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:58.203749    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:58.203758    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.203763    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.203766    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.207046    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.403784    3744 request.go:632] Waited for 196.241368ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.403948    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.403963    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.403974    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.404010    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.407767    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.608224    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:58.608245    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.608257    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.608265    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.611367    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.802284    3744 request.go:632] Waited for 190.220665ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.802382    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.802393    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.802407    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.802421    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.806173    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.108214    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:59.108238    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.108248    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.108332    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.111913    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.202533    3744 request.go:632] Waited for 89.639104ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:59.202672    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:59.202684    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.202693    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.202700    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.205790    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.608244    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:59.608308    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.608333    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.608346    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.611536    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.612038    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:59.612050    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.612056    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.612059    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.613486    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:59.613797    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:00.108234    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:00.108258    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.108269    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.108276    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.112243    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:00.112803    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:00.112811    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.112816    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.112819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.114922    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:00.608266    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:00.608291    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.608340    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.608348    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.611571    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:00.612033    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:00.612041    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.612047    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.612051    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.614268    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:01.108244    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:01.108270    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.108282    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.108287    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.112176    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:01.112688    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:01.112697    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.112703    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.112706    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.114756    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:01.608252    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:01.608269    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.608303    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.608308    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.610548    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:01.610932    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:01.610940    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.610946    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.610951    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.612574    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:02.108349    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:02.108375    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.108386    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.108392    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.111907    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:02.112645    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:02.112653    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.112658    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.112662    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.114143    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:02.114439    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:02.608228    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:02.608245    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.608252    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.608256    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.610772    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:02.611191    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:02.611199    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.611206    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.611210    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.613037    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:03.108219    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:03.108235    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.108241    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.108250    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.111668    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:03.112196    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:03.112204    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.112211    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.112214    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.114279    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:03.608402    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:03.608463    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.608509    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.608524    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.611720    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:03.612413    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:03.612424    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.612432    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.612436    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.615410    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:04.108309    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:04.108328    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.108337    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.108341    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.115334    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:04.115796    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:04.115804    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.115815    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.115818    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.122611    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:04.122876    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:04.608750    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:04.608825    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.608840    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.608846    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.612925    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:04.613492    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:04.613499    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.613505    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.613509    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.614977    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:05.106817    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:05.106842    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.106852    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.106859    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.110466    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:05.111095    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:05.111106    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.111113    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.111117    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.112615    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:05.608187    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:05.608211    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.608224    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.608248    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.611732    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:05.612260    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:05.612270    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.612278    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.612284    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.614120    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:06.107506    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:06.107527    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.107540    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.107545    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.110547    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:06.111218    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:06.111229    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.111237    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.111242    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.112971    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:06.607368    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:06.607380    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.607386    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.607391    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.609787    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:06.610207    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:06.610215    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.610221    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.610224    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.611989    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:06.612289    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:07.107726    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:07.107744    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.107773    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.107777    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.109482    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:07.109930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:07.109937    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.109943    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.109947    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.111448    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:07.607689    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:07.607742    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.607753    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.607759    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.610882    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:07.611345    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:07.611353    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.611359    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.611369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.613392    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:08.107409    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:08.107435    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.107446    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.107451    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.111199    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:08.111808    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:08.111815    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.111820    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.111825    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.113569    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:08.607450    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:08.607477    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.607489    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.607494    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.611034    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:08.611547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:08.611557    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.611563    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.611568    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.613347    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:08.613756    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:09.108698    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:09.108730    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.108778    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.108791    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.112115    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:09.112783    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:09.112791    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.112796    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.112803    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.114417    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:09.606780    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:09.606804    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.606816    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.606824    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.609915    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:09.610481    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:09.610488    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.610494    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.610497    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.612172    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:10.106727    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:10.106745    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.106779    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.106786    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.109423    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:10.109937    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:10.109944    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.109950    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.109953    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.111717    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:10.607619    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:10.607642    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.607653    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.607658    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.610928    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:10.611460    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:10.611467    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.611472    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.611475    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.613024    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:11.108825    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:11.108848    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.108859    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.108865    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.112708    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:11.113184    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:11.113195    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.113202    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.113207    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.115187    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:11.116261    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:11.607215    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:11.607243    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.607254    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.607261    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.611037    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:11.611547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:11.611557    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.611565    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.611569    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.613373    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.108739    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:12.108764    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.108774    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.108779    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.112484    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:12.113117    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:12.113125    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.113131    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.113135    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.114878    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.608099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:12.608124    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.608133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.608140    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.611866    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:12.612563    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:12.612571    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.612577    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.612581    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.614297    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.614794    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.614803    3744 pod_ready.go:82] duration metric: took 15.008248116s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.614810    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.614849    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:37:12.614854    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.614860    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.614864    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.617726    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:12.618084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:12.618092    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.618097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.618100    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.619622    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.620160    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.620169    3744 pod_ready.go:82] duration metric: took 5.352553ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.620175    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.620212    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:37:12.620217    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.620222    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.620225    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.624634    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:12.625059    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:12.625066    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.625071    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.625074    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.626559    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.626901    3744 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.626910    3744 pod_ready.go:82] duration metric: took 6.729281ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.626916    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.626951    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:37:12.626956    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.626961    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.626964    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.628480    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.628945    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:12.628956    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.628961    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.628965    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.630425    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.630760    3744 pod_ready.go:93] pod "kube-proxy-d45q5" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.630769    3744 pod_ready.go:82] duration metric: took 3.847336ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.630775    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.630807    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:12.630812    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.630817    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.630821    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.632536    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.633060    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:12.633067    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.633072    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.633077    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.634424    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:13.132549    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:13.132573    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.132585    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.132591    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.135680    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:13.136120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:13.136128    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.136133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.136137    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.137931    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:13.632454    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:13.632468    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.632474    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.632477    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.634478    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:13.634979    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:13.634987    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.634992    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.634997    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.636493    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:14.132750    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:14.132776    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.132788    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.132794    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.136342    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:14.136985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:14.136993    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.136999    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.137002    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.139021    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:14.630998    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:14.631010    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.631017    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.631019    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.637296    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:14.637754    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:14.637761    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.637767    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.637770    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.645976    3744 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0831 15:37:14.646303    3744 pod_ready.go:103] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:15.131375    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:15.131389    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.131395    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.131398    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.136989    3744 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:37:15.137543    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:15.137552    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.137557    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.137561    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.145480    3744 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:37:15.631037    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:15.631049    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.631056    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.631060    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.650939    3744 round_trippers.go:574] Response Status: 200 OK in 19 milliseconds
	I0831 15:37:15.657344    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:15.657354    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.657360    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.657363    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.664319    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:16.131044    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:16.131056    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.131062    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.131065    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.133359    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:16.133806    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:16.133815    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.133821    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.133835    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.135405    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:16.631836    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:16.631848    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.631854    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.631858    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.633942    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:16.634428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:16.634436    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.634442    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.634449    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.636230    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.131746    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:17.131800    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.131814    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.131820    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.135452    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:17.136132    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:17.136139    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.136145    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.136148    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.137779    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.138135    3744 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.138143    3744 pod_ready.go:82] duration metric: took 4.507315671s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.138150    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.138183    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:37:17.138187    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.138193    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.138198    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.140005    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.140372    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:17.140380    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.140385    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.140388    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.142052    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.142371    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.142380    3744 pod_ready.go:82] duration metric: took 4.22523ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.142387    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.142420    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:37:17.142425    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.142430    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.142433    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.144162    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.144573    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:17.144580    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.144585    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.144591    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.146052    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.146407    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.146415    3744 pod_ready.go:82] duration metric: took 4.022752ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.146422    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.208351    3744 request.go:632] Waited for 61.893937ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:17.208418    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:17.208435    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.208444    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.208449    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.211070    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:17.408566    3744 request.go:632] Waited for 197.051034ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:17.408606    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:17.408614    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.408622    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.408627    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.410767    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:17.411178    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.411187    3744 pod_ready.go:82] duration metric: took 264.75731ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.411194    3744 pod_ready.go:39] duration metric: took 21.608904421s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:37:17.411208    3744 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:37:17.411260    3744 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:37:17.423683    3744 api_server.go:72] duration metric: took 30.630215512s to wait for apiserver process to appear ...
	I0831 15:37:17.423694    3744 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:37:17.423707    3744 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:37:17.427947    3744 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:37:17.427987    3744 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:37:17.427992    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.427998    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.428008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.428562    3744 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:37:17.428682    3744 api_server.go:141] control plane version: v1.31.0
	I0831 15:37:17.428691    3744 api_server.go:131] duration metric: took 4.99355ms to wait for apiserver health ...
	I0831 15:37:17.428699    3744 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:37:17.609319    3744 request.go:632] Waited for 180.546017ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:17.609356    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:17.609364    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.609372    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.609378    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.615729    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:17.620529    3744 system_pods.go:59] 24 kube-system pods found
	I0831 15:37:17.620549    3744 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:17.620557    3744 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:17.620562    3744 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:37:17.620566    3744 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:37:17.620569    3744 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:37:17.620572    3744 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:37:17.620577    3744 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:37:17.620581    3744 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:37:17.620583    3744 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:37:17.620586    3744 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:37:17.620588    3744 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:37:17.620593    3744 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0831 15:37:17.620596    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:37:17.620599    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:37:17.620602    3744 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:37:17.620605    3744 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:37:17.620607    3744 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:37:17.620610    3744 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:37:17.620612    3744 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:37:17.620615    3744 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:37:17.620617    3744 system_pods.go:61] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:37:17.620620    3744 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:37:17.620622    3744 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:37:17.620625    3744 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:37:17.620628    3744 system_pods.go:74] duration metric: took 191.923916ms to wait for pod list to return data ...
	I0831 15:37:17.620634    3744 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:37:17.808285    3744 request.go:632] Waited for 187.597884ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:17.808399    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:17.808411    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.808422    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.808429    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.812254    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:17.812385    3744 default_sa.go:45] found service account: "default"
	I0831 15:37:17.812394    3744 default_sa.go:55] duration metric: took 191.75371ms for default service account to be created ...
	I0831 15:37:17.812410    3744 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:37:18.009398    3744 request.go:632] Waited for 196.900555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:18.009462    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:18.009503    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:18.009518    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:18.009526    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:18.017075    3744 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:37:18.022069    3744 system_pods.go:86] 24 kube-system pods found
	I0831 15:37:18.022087    3744 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:18.022093    3744 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:18.022097    3744 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:37:18.022101    3744 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:37:18.022105    3744 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:37:18.022108    3744 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:37:18.022111    3744 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:37:18.022114    3744 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:37:18.022117    3744 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:37:18.022120    3744 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:37:18.022123    3744 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:37:18.022127    3744 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0831 15:37:18.022131    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:37:18.022134    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:37:18.022138    3744 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:37:18.022140    3744 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:37:18.022143    3744 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:37:18.022146    3744 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:37:18.022148    3744 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:37:18.022152    3744 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:37:18.022155    3744 system_pods.go:89] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:37:18.022157    3744 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:37:18.022160    3744 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:37:18.022162    3744 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:37:18.022168    3744 system_pods.go:126] duration metric: took 209.74863ms to wait for k8s-apps to be running ...
	I0831 15:37:18.022173    3744 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:37:18.022230    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:37:18.033610    3744 system_svc.go:56] duration metric: took 11.428501ms WaitForService to wait for kubelet
	I0831 15:37:18.033632    3744 kubeadm.go:582] duration metric: took 31.24015665s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:37:18.033647    3744 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:37:18.208845    3744 request.go:632] Waited for 175.149396ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:37:18.208908    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:37:18.208914    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:18.208921    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:18.208926    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:18.213884    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:18.214480    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:37:18.214495    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:37:18.214504    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:37:18.214507    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:37:18.214510    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:37:18.214513    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:37:18.214516    3744 node_conditions.go:105] duration metric: took 180.864612ms to run NodePressure ...
	I0831 15:37:18.214525    3744 start.go:241] waiting for startup goroutines ...
	I0831 15:37:18.214542    3744 start.go:255] writing updated cluster config ...
	I0831 15:37:18.235038    3744 out.go:201] 
	I0831 15:37:18.272074    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:18.272141    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:37:18.293920    3744 out.go:177] * Starting "ha-949000-m03" control-plane node in "ha-949000" cluster
	I0831 15:37:18.336055    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:37:18.336091    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:37:18.336291    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:37:18.336317    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:37:18.336472    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:37:18.337744    3744 start.go:360] acquireMachinesLock for ha-949000-m03: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:37:18.337863    3744 start.go:364] duration metric: took 91.481µs to acquireMachinesLock for "ha-949000-m03"
	I0831 15:37:18.337896    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:37:18.337907    3744 fix.go:54] fixHost starting: m03
	I0831 15:37:18.338304    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:37:18.338331    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:37:18.347585    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51853
	I0831 15:37:18.347933    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:37:18.348309    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:37:18.348325    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:37:18.348554    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:37:18.348680    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:18.348764    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:37:18.348835    3744 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:18.348927    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:37:18.349821    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid 3227 missing from process table
	I0831 15:37:18.349851    3744 fix.go:112] recreateIfNeeded on ha-949000-m03: state=Stopped err=<nil>
	I0831 15:37:18.349859    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	W0831 15:37:18.349928    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:37:18.371074    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m03" ...
	I0831 15:37:18.413086    3744 main.go:141] libmachine: (ha-949000-m03) Calling .Start
	I0831 15:37:18.413447    3744 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:18.413507    3744 main.go:141] libmachine: (ha-949000-m03) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid
	I0831 15:37:18.415280    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid 3227 missing from process table
	I0831 15:37:18.415294    3744 main.go:141] libmachine: (ha-949000-m03) DBG | pid 3227 is in state "Stopped"
	I0831 15:37:18.415313    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid...
	I0831 15:37:18.415660    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Using UUID 3fdefe95-7552-4d5b-8412-6ae6e5c787bb
	I0831 15:37:18.441752    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Generated MAC fa:59:9e:3b:35:6d
	I0831 15:37:18.441781    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:37:18.441964    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00037b4a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:37:18.442001    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00037b4a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:37:18.442067    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "3fdefe95-7552-4d5b-8412-6ae6e5c787bb", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:37:18.442136    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 3fdefe95-7552-4d5b-8412-6ae6e5c787bb -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:37:18.442155    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:37:18.443921    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Pid is 3783
	I0831 15:37:18.444292    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 0
	I0831 15:37:18.444304    3744 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:18.444362    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3783
	I0831 15:37:18.446124    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:37:18.446228    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:37:18.446248    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:37:18.446260    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:37:18.446272    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d4eb85}
	I0831 15:37:18.446306    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:37:18.446321    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Found match: fa:59:9e:3b:35:6d
	I0831 15:37:18.446335    3744 main.go:141] libmachine: (ha-949000-m03) DBG | IP: 192.169.0.7
	I0831 15:37:18.446363    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:37:18.447082    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:18.447293    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:37:18.447693    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:37:18.447703    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:18.447827    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:18.447958    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:18.448072    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:18.448161    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:18.448250    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:18.448355    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:18.448517    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:18.448526    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:37:18.451810    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:37:18.461189    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:37:18.462060    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:37:18.462072    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:37:18.462081    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:37:18.462086    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:37:18.852728    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:37:18.852743    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:37:18.968113    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:37:18.968132    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:37:18.968140    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:37:18.968171    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:37:18.968968    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:37:18.968978    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:37:24.540624    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 15:37:24.540682    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 15:37:24.540695    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 15:37:24.565460    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 15:37:29.520863    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:37:29.520878    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:37:29.521004    3744 buildroot.go:166] provisioning hostname "ha-949000-m03"
	I0831 15:37:29.521015    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:37:29.521111    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.521203    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.521290    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.521386    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.521482    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.521612    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.521765    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.521776    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m03 && echo "ha-949000-m03" | sudo tee /etc/hostname
	I0831 15:37:29.591531    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m03
	
	I0831 15:37:29.591551    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.591708    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.591803    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.591884    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.591995    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.592173    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.592330    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.592341    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:37:29.658685    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:37:29.658701    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:37:29.658714    3744 buildroot.go:174] setting up certificates
	I0831 15:37:29.658720    3744 provision.go:84] configureAuth start
	I0831 15:37:29.658727    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:37:29.658867    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:29.658966    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.659054    3744 provision.go:143] copyHostCerts
	I0831 15:37:29.659089    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:37:29.659140    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:37:29.659146    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:37:29.659263    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:37:29.659455    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:37:29.659484    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:37:29.659488    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:37:29.659564    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:37:29.659714    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:37:29.659747    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:37:29.659753    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:37:29.659818    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:37:29.659964    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m03 san=[127.0.0.1 192.169.0.7 ha-949000-m03 localhost minikube]
	I0831 15:37:29.736089    3744 provision.go:177] copyRemoteCerts
	I0831 15:37:29.736163    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:37:29.736179    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.736322    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.736416    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.736504    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.736597    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:29.771590    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:37:29.771658    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:37:29.791254    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:37:29.791326    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:37:29.810923    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:37:29.810991    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:37:29.830631    3744 provision.go:87] duration metric: took 171.900577ms to configureAuth
	I0831 15:37:29.830645    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:37:29.830811    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:29.830824    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:29.830954    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.831042    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.831126    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.831207    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.831289    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.831399    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.831522    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.831530    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:37:29.892205    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:37:29.892217    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:37:29.892291    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:37:29.892302    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.892426    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.892516    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.892609    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.892714    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.892838    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.892976    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.893022    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:37:29.961258    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:37:29.961276    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.961414    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.961511    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.961619    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.961703    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.961817    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.961955    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.961967    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:37:31.615783    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:37:31.615799    3744 machine.go:96] duration metric: took 13.167957184s to provisionDockerMachine
	I0831 15:37:31.615806    3744 start.go:293] postStartSetup for "ha-949000-m03" (driver="hyperkit")
	I0831 15:37:31.615814    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:37:31.615823    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.616028    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:37:31.616046    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.616158    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.616258    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.616349    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.616481    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:31.654537    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:37:31.657860    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:37:31.657873    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:37:31.657960    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:37:31.658093    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:37:31.658099    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:37:31.658258    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:37:31.672215    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:37:31.694606    3744 start.go:296] duration metric: took 78.79067ms for postStartSetup
	I0831 15:37:31.694628    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.694811    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:37:31.694825    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.694916    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.695011    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.695099    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.695179    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:31.731833    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:37:31.731896    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:37:31.763292    3744 fix.go:56] duration metric: took 13.425238964s for fixHost
	I0831 15:37:31.763317    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.763450    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.763540    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.763638    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.763730    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.763846    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:31.764005    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:31.764012    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:37:31.823707    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143851.888011101
	
	I0831 15:37:31.823721    3744 fix.go:216] guest clock: 1725143851.888011101
	I0831 15:37:31.823727    3744 fix.go:229] Guest: 2024-08-31 15:37:31.888011101 -0700 PDT Remote: 2024-08-31 15:37:31.763307 -0700 PDT m=+82.036146513 (delta=124.704101ms)
	I0831 15:37:31.823737    3744 fix.go:200] guest clock delta is within tolerance: 124.704101ms
	I0831 15:37:31.823741    3744 start.go:83] releasing machines lock for "ha-949000-m03", held for 13.485720355s
	I0831 15:37:31.823765    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.823906    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:31.845130    3744 out.go:177] * Found network options:
	I0831 15:37:31.865299    3744 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0831 15:37:31.886126    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:37:31.886160    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:37:31.886178    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.886943    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.887142    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.887254    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:37:31.887286    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	W0831 15:37:31.887368    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:37:31.887394    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:37:31.887504    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:37:31.887511    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.887521    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.887696    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.887743    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.887910    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.887987    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.888104    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.888248    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:31.888351    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	W0831 15:37:31.921752    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:37:31.921817    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:37:31.966799    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:37:31.966823    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:37:31.966938    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:37:31.983482    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:37:31.992712    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:37:32.002010    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:37:32.002056    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:37:32.011011    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:37:32.020061    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:37:32.028982    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:37:32.038569    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:37:32.048027    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:37:32.057745    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:37:32.066832    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:37:32.075930    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:37:32.084234    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:37:32.092513    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:32.200002    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:37:32.218717    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:37:32.218782    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:37:32.234470    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:37:32.246859    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:37:32.268072    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:37:32.279723    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:37:32.291270    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:37:32.313992    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:37:32.325465    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:37:32.340891    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:37:32.343755    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:37:32.351807    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:37:32.365348    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:37:32.460495    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:37:32.562594    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:37:32.562619    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:37:32.576763    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:32.677110    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:37:34.994745    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.317591857s)
	I0831 15:37:34.994823    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:37:35.005392    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:37:35.018138    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:37:35.028648    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:37:35.124983    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:37:35.235732    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:35.346302    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:37:35.360082    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:37:35.370959    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:35.477096    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:37:35.544102    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:37:35.544184    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:37:35.548776    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:37:35.548834    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:37:35.551795    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:37:35.578659    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:37:35.578734    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:37:35.596206    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:37:35.640045    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:37:35.682013    3744 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:37:35.703018    3744 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:37:35.723860    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:35.724174    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:37:35.728476    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:37:35.738147    3744 mustload.go:65] Loading cluster: ha-949000
	I0831 15:37:35.738335    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:35.738551    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:37:35.738572    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:37:35.747642    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51875
	I0831 15:37:35.747990    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:37:35.748302    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:37:35.748315    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:37:35.748544    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:37:35.748655    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:37:35.748733    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:35.748808    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:37:35.749749    3744 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:37:35.749998    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:37:35.750023    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:37:35.758673    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51877
	I0831 15:37:35.758994    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:37:35.759349    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:37:35.759365    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:37:35.759557    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:37:35.759653    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:37:35.759755    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.7
	I0831 15:37:35.759761    3744 certs.go:194] generating shared ca certs ...
	I0831 15:37:35.759770    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:37:35.759913    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:37:35.759965    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:37:35.759974    3744 certs.go:256] generating profile certs ...
	I0831 15:37:35.760073    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:37:35.760161    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3
	I0831 15:37:35.760221    3744 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:37:35.760228    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:37:35.760249    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:37:35.760273    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:37:35.760292    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:37:35.760308    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:37:35.760333    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:37:35.760352    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:37:35.760368    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:37:35.760450    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:37:35.760489    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:37:35.760497    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:37:35.760534    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:37:35.760565    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:37:35.760594    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:37:35.760658    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:37:35.760694    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:35.760715    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:37:35.760733    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:37:35.760757    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:37:35.760839    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:37:35.760910    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:37:35.761012    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:37:35.761091    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:37:35.789354    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:37:35.793275    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:37:35.801794    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:37:35.805175    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:37:35.813194    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:37:35.816357    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:37:35.824019    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:37:35.827176    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:37:35.835398    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:37:35.838546    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:37:35.847890    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:37:35.851045    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:37:35.858866    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:37:35.879287    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:37:35.899441    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:37:35.919810    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:37:35.940109    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0831 15:37:35.960051    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:37:35.979638    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:37:35.999504    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:37:36.019089    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:37:36.039173    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:37:36.058828    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:37:36.078456    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:37:36.092789    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:37:36.106379    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:37:36.119946    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:37:36.133839    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:37:36.148101    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:37:36.161739    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:37:36.175159    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:37:36.179390    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:37:36.187703    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:37:36.191071    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:37:36.191114    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:37:36.195292    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:37:36.203552    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:37:36.212239    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:36.215746    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:36.215790    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:36.219988    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:37:36.228608    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:37:36.237421    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:37:36.240805    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:37:36.240843    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:37:36.245119    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
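The `test -L … || ln -fs …` commands above implement OpenSSL's hashed-directory convention: each CA in `/etc/ssl/certs` is reachable through a symlink named after the certificate's subject hash plus a `.0` suffix, which is exactly what the earlier `openssl x509 -hash -noout` calls printed. A minimal reproduction with a throwaway self-signed cert (the CN and `/tmp` paths are illustrative, not from this run):

```shell
# Generate a disposable self-signed cert (hypothetical CN, /tmp paths).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=hash-demo" \
  -keyout /tmp/hashdemo.key -out /tmp/hashdemo.pem 2>/dev/null

# The subject hash is an 8-hex-digit value; OpenSSL resolves CAs as <hash>.0.
h=$(openssl x509 -hash -noout -in /tmp/hashdemo.pem)
ln -fs /tmp/hashdemo.pem "/tmp/$h.0"
echo "$h" | grep -Eq '^[0-9a-f]{8}$' && echo "symlinked as $h.0"
```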
	I0831 15:37:36.253604    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:37:36.256982    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:37:36.261329    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:37:36.265579    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:37:36.269756    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:37:36.273922    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:37:36.278236    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
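Each `-checkend 86400` call above exits non-zero if the certificate expires within the next 86400 seconds (24 hours); minikube uses the exit status to decide whether a cert needs regenerating. The semantics can be demonstrated standalone (disposable cert, hypothetical paths):

```shell
# Create a cert valid for 2 days, then exercise both -checkend outcomes.
openssl req -x509 -newkey rsa:2048 -nodes -days 2 -subj "/CN=checkend-demo" \
  -keyout /tmp/ck.key -out /tmp/ck.crt 2>/dev/null

# Exit 0: the cert does NOT expire within the next 24h.
openssl x509 -noout -in /tmp/ck.crt -checkend 86400 && echo "valid beyond 24h"

# Exit 1: the cert DOES expire within the next 30 days (2592000s).
openssl x509 -noout -in /tmp/ck.crt -checkend 2592000 || echo "expires within 30d"
```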
	I0831 15:37:36.282870    3744 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.31.0 docker true true} ...
	I0831 15:37:36.282943    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
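The empty `ExecStart=` followed by a second `ExecStart=` in the kubelet unit above is the standard systemd drop-in idiom: for a non-oneshot service, a drop-in must first clear the inherited command list before it can set a new one, otherwise systemd rejects the unit as having multiple start commands. The drop-in minikube writes has roughly this shape (flags abbreviated from the log; not the exact file contents):

```ini
# /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (shape only)
[Service]
ExecStart=
ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
```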
	I0831 15:37:36.282961    3744 kube-vip.go:115] generating kube-vip config ...
	I0831 15:37:36.283008    3744 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:37:36.296221    3744 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:37:36.296272    3744 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
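The leader-election values in the kube-vip config above (`vip_leaseduration=5`, `vip_renewdeadline=3`, `vip_retryperiod=1`) must satisfy the ordering required by client-go's leader-election library, which kube-vip builds on: the retry period must be shorter than the renew deadline, which must be shorter than the lease duration. A trivial sanity check of those values (not minikube code):

```shell
# Constraint from client-go leader election: retry < renew_deadline < lease_duration.
lease=5; renew=3; retry=1
[ "$retry" -lt "$renew" ] && [ "$renew" -lt "$lease" ] && echo "timings consistent"
```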
	I0831 15:37:36.296330    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:37:36.304482    3744 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:37:36.304539    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:37:36.311975    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:37:36.325288    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:37:36.338951    3744 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:37:36.352501    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:37:36.355411    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
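The one-liner above is an idempotent `/etc/hosts` update: it filters out any stale `control-plane.minikube.internal` line, appends the current VIP, writes to a temp file, and copies it back with sudo, so repeated runs never accumulate duplicates. The same pattern on a scratch file (the real command edits `/etc/hosts` on the VM over SSH; `/tmp` paths here are illustrative):

```shell
# Seed a hosts file that already contains a stale VIP entry.
hosts=/tmp/hosts.demo
printf '127.0.0.1\tlocalhost\n192.169.0.9\tcontrol-plane.minikube.internal\n' > "$hosts"

# Drop the stale entry, append the current VIP, swap the file in place.
{ grep -v 'control-plane.minikube.internal$' "$hosts"; \
  printf '192.169.0.254\tcontrol-plane.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"

grep -c 'control-plane.minikube.internal' "$hosts"   # prints 1
```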
	I0831 15:37:36.364926    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:36.456418    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:37:36.471558    3744 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:37:36.471752    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:36.529525    3744 out.go:177] * Verifying Kubernetes components...
	I0831 15:37:36.550389    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:36.691381    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:37:36.709538    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:37:36.709731    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, U
serAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:37:36.709775    3744 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:37:36.709942    3744 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:37:36.709989    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:36.709994    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.710000    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.710003    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.712128    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:36.712576    3744 node_ready.go:49] node "ha-949000-m03" has status "Ready":"True"
	I0831 15:37:36.712585    3744 node_ready.go:38] duration metric: took 2.63459ms for node "ha-949000-m03" to be "Ready" ...
	I0831 15:37:36.712591    3744 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:37:36.712631    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:36.712638    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.712643    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.712650    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.716253    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:36.722917    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:36.722974    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:36.722980    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.722986    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.722989    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.725559    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:36.726201    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:36.726209    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.726215    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.726231    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.728257    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:37.223697    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:37.223717    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.223728    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.223737    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.235316    3744 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0831 15:37:37.236200    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:37.236213    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.236221    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.236224    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.238445    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:37.723177    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:37.723191    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.723198    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.723201    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.730411    3744 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:37:37.731034    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:37.731043    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.731048    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.731053    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.733549    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.223151    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:38.223168    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.223174    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.223177    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.225984    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.226378    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:38.226386    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.226391    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.226394    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.229300    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.724309    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:38.724325    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.724334    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.724337    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.726908    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.727435    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:38.727443    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.727449    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.727454    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.729651    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.730063    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
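The repeating GET pairs above are a poll-until-ready loop: fetch the coredns pod and its node roughly every 500ms, within a 6-minute budget, until the pod reports `Ready: True`. The control flow can be sketched as a generic shell helper (the function name and the scaled-down timings are illustrative, not minikube's implementation):

```shell
# wait_until <timeout_s> <interval_s> <cmd...> — poll until <cmd> succeeds
# or the deadline passes. Mirrors the pod_ready loop's shape (6m budget,
# ~500ms interval in the log; scaled down here).
wait_until() {
  local deadline=$(( $(date +%s) + $1 )); shift
  local interval=$1; shift
  while [ "$(date +%s)" -lt "$deadline" ]; do
    "$@" && return 0
    sleep "$interval"
  done
  return 1
}

# Demo: the check "becomes Ready" on the third poll.
: > /tmp/polls.demo
check() { echo x >> /tmp/polls.demo; [ "$(wc -l < /tmp/polls.demo)" -ge 3 ]; }
wait_until 5 0.2 check && echo "ready after $(wc -l < /tmp/polls.demo) polls"
```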
	I0831 15:37:39.223582    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:39.223601    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.223608    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.223627    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.225990    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:39.226495    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:39.226503    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.226509    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.226514    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.228583    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:39.724043    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:39.724057    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.724068    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.724079    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.726325    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:39.726730    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:39.726738    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.726744    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.726748    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.728764    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:40.223977    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:40.223993    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.224000    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.224004    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.226279    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:40.226700    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:40.226708    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.226714    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.226718    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.228516    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:40.724602    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:40.724619    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.724628    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.724634    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.727418    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:40.727959    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:40.727966    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.727972    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.727983    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.729907    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:40.730276    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:41.223101    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:41.223117    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.223124    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.223128    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.225118    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:41.225750    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:41.225761    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.225768    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.225772    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.227757    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:41.724913    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:41.724940    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.724951    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.725035    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.728761    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:41.729240    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:41.729247    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.729252    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.729255    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.730912    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:42.224964    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:42.224989    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.225001    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.225006    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.228620    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:42.229196    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:42.229204    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.229210    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.229214    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.232307    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:42.725079    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:42.725106    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.725118    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.725128    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.728799    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:42.729409    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:42.729420    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.729429    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.729435    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.731172    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:42.731531    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:43.225019    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:43.225047    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.225060    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.225067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.228808    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:43.229389    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:43.229399    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.229405    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.229409    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.231056    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:43.724985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:43.725000    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.725006    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.725010    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.727056    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:43.727478    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:43.727485    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.727491    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.727494    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.729068    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:44.224095    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:44.224121    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.224133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.224181    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.227349    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:44.228120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:44.228128    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.228134    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.228138    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.229966    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:44.725021    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:44.725045    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.725058    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.725062    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.729238    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:44.729727    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:44.729735    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.729741    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.729745    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.731433    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:44.731726    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:45.225302    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:45.225330    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.225341    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.225347    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.228863    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:45.229379    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:45.229389    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.229397    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.229401    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.231429    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:45.724243    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:45.724324    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.724337    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.724344    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.727683    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:45.728405    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:45.728413    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.728419    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.728422    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.730098    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:46.223716    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:46.223773    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.223788    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.223796    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.227605    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:46.228067    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:46.228076    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.228082    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.228085    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.229768    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:46.724565    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:46.724619    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.724633    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.724641    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.728150    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:46.728985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:46.728992    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.728998    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.729001    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.730855    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:47.224578    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:47.224599    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.224612    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.224618    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.227578    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:47.228002    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:47.228009    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.228015    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.228018    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.229721    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:47.230041    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:47.724560    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:47.724585    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.724594    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.724599    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.728122    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:47.728734    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:47.728742    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.728748    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.728751    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.730435    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:48.223615    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:48.223629    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.223636    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.223640    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.226095    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:48.226577    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:48.226586    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.226591    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.226598    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.228415    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:48.724122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:48.724142    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.724153    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.724160    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.727651    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:48.728172    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:48.728183    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.728191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.728195    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.729902    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:49.223260    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:49.223281    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.223292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.223298    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.226301    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:49.226932    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:49.226940    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.226945    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.226947    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.228480    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:49.724076    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:49.724109    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.724120    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.724127    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.727544    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:49.728275    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:49.728284    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.728290    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.728293    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.729994    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:49.730332    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:50.223448    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:50.223462    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.223471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.223475    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.225685    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:50.226217    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:50.226225    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.226231    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.226242    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.228286    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:50.723871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:50.723896    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.723910    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.723918    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.727053    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:50.728013    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:50.728021    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.728027    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.728033    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.729924    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:51.223394    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:51.223411    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.223419    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.223424    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.226019    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:51.226638    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:51.226646    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.226652    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.226662    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.228242    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:51.724305    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:51.724331    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.724341    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.724348    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.728121    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:51.728579    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:51.728588    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.728593    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.728603    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.730578    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:51.730868    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:52.223952    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:52.224012    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.224021    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.224025    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.226458    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:52.227072    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:52.227080    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.227087    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.227090    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.228719    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:52.724240    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:52.724287    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.724299    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.724308    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.727394    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:52.727827    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:52.727834    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.727840    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.727844    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.729417    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:53.224920    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:53.225020    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.225037    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.225045    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.228826    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:53.229364    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:53.229374    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.229380    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.229387    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.231081    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:53.723365    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:53.723381    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.723393    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.723397    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.725512    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:53.725934    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:53.725942    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.725948    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.725951    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.727517    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.223251    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:54.223290    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.223310    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.223318    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.225362    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.225778    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.225786    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.225792    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.225797    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.227316    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.227664    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:54.723470    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:54.723553    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.723566    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.723572    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.726339    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.727040    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.727047    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.727053    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.727056    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.729195    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.729717    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.729726    3744 pod_ready.go:82] duration metric: took 18.006599646s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.729733    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.729768    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:37:54.729773    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.729779    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.729782    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.731747    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.732348    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.732355    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.732364    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.732369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.734207    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.734716    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.734725    3744 pod_ready.go:82] duration metric: took 4.986587ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.734738    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.734775    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:37:54.734780    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.734785    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.734789    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.736900    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.737556    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.737563    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.737569    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.737573    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.739693    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.740047    3744 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.740059    3744 pod_ready.go:82] duration metric: took 5.312281ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.740065    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.740098    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:37:54.740102    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.740108    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.740113    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.742355    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.742925    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:54.742933    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.742939    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.742944    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.744985    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.745483    3744 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.745493    3744 pod_ready.go:82] duration metric: took 5.421796ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.745499    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.745536    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:37:54.745541    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.745547    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.745550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.747563    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.748056    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:54.748063    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.748069    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.748071    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.749754    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.750027    3744 pod_ready.go:93] pod "etcd-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.750036    3744 pod_ready.go:82] duration metric: took 4.531272ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.750045    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.924527    3744 request.go:632] Waited for 174.448251ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:37:54.924561    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:37:54.924565    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.924570    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.924576    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.926540    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:55.124217    3744 request.go:632] Waited for 197.191409ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:55.124320    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:55.124331    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.124342    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.124349    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.127699    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:55.127979    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:55.127988    3744 pod_ready.go:82] duration metric: took 377.933462ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.127995    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.323995    3744 request.go:632] Waited for 195.947787ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:37:55.324122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:37:55.324133    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.324142    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.324147    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.326536    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:55.524340    3744 request.go:632] Waited for 197.377407ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:55.524428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:55.524437    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.524444    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.524458    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.527694    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:55.528065    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:55.528075    3744 pod_ready.go:82] duration metric: took 400.071053ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.528082    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.724069    3744 request.go:632] Waited for 195.89026ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:37:55.724147    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:37:55.724160    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.724178    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.724193    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.727264    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:55.924174    3744 request.go:632] Waited for 196.444661ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:55.924262    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:55.924273    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.924284    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.924290    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.927217    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:55.927667    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:55.927677    3744 pod_ready.go:82] duration metric: took 399.585518ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.927691    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.123773    3744 request.go:632] Waited for 195.997614ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:37:56.123824    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:37:56.123834    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.123859    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.123868    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.126826    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:56.323602    3744 request.go:632] Waited for 196.242245ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:56.323669    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:56.323713    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.323725    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.323736    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.326205    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:56.326487    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:56.326497    3744 pod_ready.go:82] duration metric: took 398.79568ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.326504    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.525262    3744 request.go:632] Waited for 198.697997ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:56.525404    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:56.525415    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.525426    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.525435    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.528812    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:56.723576    3744 request.go:632] Waited for 194.289214ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:56.723635    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:56.723642    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.723648    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.723664    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.725655    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:56.726101    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:56.726110    3744 pod_ready.go:82] duration metric: took 399.596067ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.726117    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.923811    3744 request.go:632] Waited for 197.624636ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:37:56.923859    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:37:56.923866    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.923874    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.923879    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.926307    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.123874    3744 request.go:632] Waited for 197.165319ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.123948    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.123956    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.123964    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.123981    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.126673    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.127130    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:57.127139    3744 pod_ready.go:82] duration metric: took 401.01276ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.127146    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.323575    3744 request.go:632] Waited for 196.38297ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:37:57.323627    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:37:57.323635    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.323646    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.323654    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.326792    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:57.524981    3744 request.go:632] Waited for 197.675488ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:57.525056    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:57.525064    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.525072    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.525077    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.527436    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.527834    3744 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:57.527844    3744 pod_ready.go:82] duration metric: took 400.687607ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.527851    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.724761    3744 request.go:632] Waited for 196.867729ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:37:57.724843    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:37:57.724852    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.724860    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.724864    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.727338    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.924277    3744 request.go:632] Waited for 196.366483ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.924352    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.924361    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.924369    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.924376    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.926744    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.927036    3744 pod_ready.go:93] pod "kube-proxy-d45q5" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:57.927045    3744 pod_ready.go:82] duration metric: took 399.185058ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.927052    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.123932    3744 request.go:632] Waited for 196.831846ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:58.124040    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:58.124050    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.124062    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.124067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.127075    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:58.323899    3744 request.go:632] Waited for 196.438465ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.323934    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.323939    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.323946    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.323982    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.326076    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:58.326347    3744 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:58.326358    3744 pod_ready.go:82] duration metric: took 399.29367ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.326365    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.524333    3744 request.go:632] Waited for 197.864558ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:37:58.524448    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:37:58.524460    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.524471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.524478    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.527937    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:58.724668    3744 request.go:632] Waited for 196.043209ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.724763    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.724780    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.724797    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.724815    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.727732    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:58.728090    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:58.728099    3744 pod_ready.go:82] duration metric: took 401.725065ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.728105    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.925170    3744 request.go:632] Waited for 197.0037ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:37:58.925325    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:37:58.925339    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.925351    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.925358    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.928967    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:59.124043    3744 request.go:632] Waited for 194.666869ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:59.124133    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:59.124143    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.124154    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.124161    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.127137    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:59.127523    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:59.127532    3744 pod_ready.go:82] duration metric: took 399.417767ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:59.127541    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:59.324020    3744 request.go:632] Waited for 196.418346ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:59.324169    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:59.324180    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.324191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.324200    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.327657    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:59.523961    3744 request.go:632] Waited for 195.650623ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:59.524073    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:59.524086    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.524097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.524105    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.527091    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:59.527542    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:59.527550    3744 pod_ready.go:82] duration metric: took 399.999976ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:59.527558    3744 pod_ready.go:39] duration metric: took 22.814715363s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:37:59.527569    3744 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:37:59.527620    3744 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:37:59.540037    3744 api_server.go:72] duration metric: took 23.068203242s to wait for apiserver process to appear ...
	I0831 15:37:59.540049    3744 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:37:59.540059    3744 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:37:59.543113    3744 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:37:59.543146    3744 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:37:59.543150    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.543156    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.543161    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.543866    3744 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:37:59.543927    3744 api_server.go:141] control plane version: v1.31.0
	I0831 15:37:59.543936    3744 api_server.go:131] duration metric: took 3.882759ms to wait for apiserver health ...
	I0831 15:37:59.543942    3744 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:37:59.723587    3744 request.go:632] Waited for 179.596374ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:59.723694    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:59.723708    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.723718    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.723734    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.728359    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:59.733656    3744 system_pods.go:59] 24 kube-system pods found
	I0831 15:37:59.733668    3744 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:37:59.733672    3744 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:37:59.733676    3744 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:37:59.733679    3744 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:37:59.733681    3744 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:37:59.733684    3744 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:37:59.733686    3744 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:37:59.733689    3744 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:37:59.733691    3744 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:37:59.733694    3744 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:37:59.733696    3744 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:37:59.733699    3744 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:37:59.733702    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:37:59.733705    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:37:59.733708    3744 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:37:59.733710    3744 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:37:59.733714    3744 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:37:59.733718    3744 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:37:59.733721    3744 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:37:59.733724    3744 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:37:59.733726    3744 system_pods.go:61] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:37:59.733729    3744 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:37:59.733731    3744 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:37:59.733734    3744 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:37:59.733738    3744 system_pods.go:74] duration metric: took 189.789494ms to wait for pod list to return data ...
	I0831 15:37:59.733743    3744 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:37:59.923784    3744 request.go:632] Waited for 189.987121ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:59.923870    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:59.923881    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.923893    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.923900    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.927288    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:59.927352    3744 default_sa.go:45] found service account: "default"
	I0831 15:37:59.927361    3744 default_sa.go:55] duration metric: took 193.611323ms for default service account to be created ...
	I0831 15:37:59.927366    3744 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:38:00.124803    3744 request.go:632] Waited for 197.388029ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:38:00.124898    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:38:00.124909    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:00.124920    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:00.124939    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:00.129956    3744 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:38:00.134973    3744 system_pods.go:86] 24 kube-system pods found
	I0831 15:38:00.134985    3744 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:38:00.134989    3744 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:38:00.134993    3744 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:38:00.134996    3744 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:38:00.134999    3744 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:38:00.135002    3744 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:38:00.135005    3744 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:38:00.135008    3744 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:38:00.135011    3744 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:38:00.135013    3744 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:38:00.135017    3744 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:38:00.135019    3744 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:38:00.135025    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:38:00.135028    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:38:00.135031    3744 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:38:00.135034    3744 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:38:00.135037    3744 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:38:00.135039    3744 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:38:00.135042    3744 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:38:00.135045    3744 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:38:00.135049    3744 system_pods.go:89] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:38:00.135051    3744 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:38:00.135056    3744 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:38:00.135060    3744 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:38:00.135065    3744 system_pods.go:126] duration metric: took 207.692433ms to wait for k8s-apps to be running ...
	I0831 15:38:00.135070    3744 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:38:00.135137    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:38:00.146618    3744 system_svc.go:56] duration metric: took 11.54297ms WaitForService to wait for kubelet
	I0831 15:38:00.146633    3744 kubeadm.go:582] duration metric: took 23.674794454s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:38:00.146650    3744 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:38:00.324468    3744 request.go:632] Waited for 177.772827ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:38:00.324541    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:38:00.324549    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:00.324557    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:00.324561    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:00.326804    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:38:00.327655    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:38:00.327666    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:38:00.327673    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:38:00.327677    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:38:00.327680    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:38:00.327683    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:38:00.327689    3744 node_conditions.go:105] duration metric: took 181.029342ms to run NodePressure ...
	I0831 15:38:00.327697    3744 start.go:241] waiting for startup goroutines ...
	I0831 15:38:00.327709    3744 start.go:255] writing updated cluster config ...
	I0831 15:38:00.348472    3744 out.go:201] 
	I0831 15:38:00.369311    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:00.369379    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:38:00.391565    3744 out.go:177] * Starting "ha-949000-m04" worker node in "ha-949000" cluster
	I0831 15:38:00.433358    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:38:00.433417    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:38:00.433601    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:38:00.433620    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:38:00.433752    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:38:00.434936    3744 start.go:360] acquireMachinesLock for ha-949000-m04: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:38:00.435036    3744 start.go:364] duration metric: took 76.344µs to acquireMachinesLock for "ha-949000-m04"
	I0831 15:38:00.435061    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:38:00.435070    3744 fix.go:54] fixHost starting: m04
	I0831 15:38:00.435494    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:38:00.435519    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:38:00.444781    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51881
	I0831 15:38:00.445158    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:38:00.445521    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:38:00.445531    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:38:00.445763    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:38:00.445892    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:00.445989    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:38:00.446076    3744 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:00.446156    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:38:00.447072    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid 3377 missing from process table
	I0831 15:38:00.447102    3744 fix.go:112] recreateIfNeeded on ha-949000-m04: state=Stopped err=<nil>
	I0831 15:38:00.447112    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	W0831 15:38:00.447197    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:38:00.468433    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m04" ...
	I0831 15:38:00.542198    3744 main.go:141] libmachine: (ha-949000-m04) Calling .Start
	I0831 15:38:00.542515    3744 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:00.542650    3744 main.go:141] libmachine: (ha-949000-m04) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid
	I0831 15:38:00.544312    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid 3377 missing from process table
	I0831 15:38:00.544344    3744 main.go:141] libmachine: (ha-949000-m04) DBG | pid 3377 is in state "Stopped"
	I0831 15:38:00.544372    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid...
	I0831 15:38:00.544580    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Using UUID 5ee34770-2239-4427-9789-bd204fe095a6
	I0831 15:38:00.571913    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Generated MAC 8a:3c:61:5f:c5:84
	I0831 15:38:00.571940    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:38:00.572058    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bec00)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:38:00.572092    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bec00)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:38:00.572124    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "5ee34770-2239-4427-9789-bd204fe095a6", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:38:00.572235    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 5ee34770-2239-4427-9789-bd204fe095a6 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:38:00.572259    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:38:00.573709    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Pid is 3806
	I0831 15:38:00.574064    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 0
	I0831 15:38:00.574112    3744 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:00.574129    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3806
	I0831 15:38:00.576177    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:38:00.576262    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:38:00.576305    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eca7}
	I0831 15:38:00.576319    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:38:00.576335    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:38:00.576351    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d4eb85}
	I0831 15:38:00.576382    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Found match: 8a:3c:61:5f:c5:84
	I0831 15:38:00.576399    3744 main.go:141] libmachine: (ha-949000-m04) DBG | IP: 192.169.0.8
	I0831 15:38:00.576410    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetConfigRaw
	I0831 15:38:00.577215    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:00.577389    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:38:00.577864    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:38:00.577878    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:00.578009    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:00.578108    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:00.578212    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:00.578342    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:00.578431    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:00.578558    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:00.578712    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:00.578720    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:38:00.582294    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:38:00.590710    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:38:00.591705    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:38:00.591723    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:38:00.591734    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:38:00.591743    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:38:00.976655    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:38:00.976695    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:38:01.091423    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:38:01.091445    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:38:01.091527    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:38:01.091554    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:38:01.092272    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:38:01.092283    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:38:06.721349    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:38:06.721473    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:38:06.721482    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:38:06.745779    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:38:11.647284    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:38:11.647298    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:38:11.647457    3744 buildroot.go:166] provisioning hostname "ha-949000-m04"
	I0831 15:38:11.647468    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:38:11.647566    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.647657    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:11.647737    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.647830    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.647929    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:11.648056    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:11.648211    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:11.648224    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m04 && echo "ha-949000-m04" | sudo tee /etc/hostname
	I0831 15:38:11.720881    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m04
	
	I0831 15:38:11.720895    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.721030    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:11.721141    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.721229    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.721323    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:11.721445    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:11.721583    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:11.721594    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:38:11.787551    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:38:11.787565    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:38:11.787574    3744 buildroot.go:174] setting up certificates
	I0831 15:38:11.787580    3744 provision.go:84] configureAuth start
	I0831 15:38:11.787586    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:38:11.787717    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:11.787807    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.787897    3744 provision.go:143] copyHostCerts
	I0831 15:38:11.787923    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:38:11.787987    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:38:11.787993    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:38:11.788130    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:38:11.788325    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:38:11.788370    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:38:11.788375    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:38:11.788450    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:38:11.788631    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:38:11.788686    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:38:11.788692    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:38:11.788777    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:38:11.788936    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m04 san=[127.0.0.1 192.169.0.8 ha-949000-m04 localhost minikube]
	I0831 15:38:11.923616    3744 provision.go:177] copyRemoteCerts
	I0831 15:38:11.923670    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:38:11.923684    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.923822    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:11.923908    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.924002    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:11.924089    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:11.965052    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:38:11.965128    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:38:11.989075    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:38:11.989152    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:38:12.008938    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:38:12.009008    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:38:12.028923    3744 provision.go:87] duration metric: took 241.333371ms to configureAuth
	I0831 15:38:12.028939    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:38:12.029131    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:12.029146    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:12.029282    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:12.029361    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:12.029448    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.029527    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.029620    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:12.029746    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:12.029867    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:12.029874    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:38:12.090450    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:38:12.090463    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:38:12.090535    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:38:12.090548    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:12.090681    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:12.090786    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.090898    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.091016    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:12.091186    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:12.091325    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:12.091371    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:38:12.161741    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	Environment=NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this option.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:38:12.161767    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:12.161902    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:12.161995    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.162101    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.162204    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:12.162325    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:12.162467    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:12.162479    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:38:13.717080    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
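The SSH command just above uses a compare-and-swap idiom: `diff -u old new` exits non-zero when the files differ (or, as in this run, when the installed unit does not yet exist), and only then does the `||` branch install the new file and restart the service. A minimal standalone sketch of the same pattern, using scratch paths under `/tmp` rather than the real `/lib/systemd` locations:

```shell
#!/bin/sh
# Sketch of the "replace unit only if changed" idiom from the log.
# Paths are illustrative, not the real systemd unit locations.
new=/tmp/demo.service.new
cur=/tmp/demo.service

printf '[Unit]\nDescription=demo\n' > "$new"

# diff exits non-zero when the files differ or $cur is missing,
# so the || branch installs the new file in exactly those cases.
diff -u "$cur" "$new" 2>/dev/null || {
    mv "$new" "$cur"
    echo "installed new unit"   # the real command runs daemon-reload + restart here
}
cat "$cur"
```

Run twice, the second invocation's `diff` succeeds and the install branch is skipped, which is what keeps the provisioning step idempotent.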
	I0831 15:38:13.717094    3744 machine.go:96] duration metric: took 13.139081069s to provisionDockerMachine
	I0831 15:38:13.717101    3744 start.go:293] postStartSetup for "ha-949000-m04" (driver="hyperkit")
	I0831 15:38:13.717109    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:38:13.717123    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.717308    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:38:13.717321    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.717411    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.717514    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.717598    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.717686    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:13.753970    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:38:13.757041    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:38:13.757049    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:38:13.757147    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:38:13.757317    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:38:13.757323    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:38:13.757520    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:38:13.764743    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:38:13.784545    3744 start.go:296] duration metric: took 67.430377ms for postStartSetup
	I0831 15:38:13.784594    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.784782    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:38:13.784795    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.784891    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.784980    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.785074    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.785157    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:13.822419    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:38:13.822478    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:38:13.856251    3744 fix.go:56] duration metric: took 13.421034183s for fixHost
	I0831 15:38:13.856276    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.856412    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.856504    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.856591    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.856670    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.856794    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:13.856933    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:13.856940    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:38:13.917606    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143893.981325007
	
	I0831 15:38:13.917619    3744 fix.go:216] guest clock: 1725143893.981325007
	I0831 15:38:13.917634    3744 fix.go:229] Guest: 2024-08-31 15:38:13.981325007 -0700 PDT Remote: 2024-08-31 15:38:13.856265 -0700 PDT m=+124.128653576 (delta=125.060007ms)
	I0831 15:38:13.917650    3744 fix.go:200] guest clock delta is within tolerance: 125.060007ms
	I0831 15:38:13.917655    3744 start.go:83] releasing machines lock for "ha-949000-m04", held for 13.482464465s
	I0831 15:38:13.917676    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.917802    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:13.942019    3744 out.go:177] * Found network options:
	I0831 15:38:13.963076    3744 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	W0831 15:38:13.984049    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:38:13.984067    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:38:13.984075    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:38:13.984086    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.984514    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.984633    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.984692    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:38:13.984722    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	W0831 15:38:13.984773    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:38:13.984786    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:38:13.984810    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	W0831 15:38:13.984809    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:38:13.984873    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:38:13.984894    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.984907    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.984995    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.985009    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.985085    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.985105    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:13.985186    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.985271    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	W0831 15:38:14.024342    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:38:14.024407    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:38:14.067158    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:38:14.067173    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:38:14.067244    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:38:14.082520    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:38:14.090779    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:38:14.099040    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:38:14.099091    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:38:14.107242    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:38:14.115660    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:38:14.124011    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:38:14.132309    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:38:14.140696    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:38:14.149089    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:38:14.157409    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
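The run of `sed` commands above rewrites `/etc/containerd/config.toml` in place: pinning the sandbox image, forcing `SystemdCgroup = false` to match the chosen cgroupfs driver, and migrating runtime names to `io.containerd.runc.v2`. One of those edits, applied to a scratch copy of the file so the indentation-preserving capture group is visible:

```shell
#!/bin/bash
# Apply the SystemdCgroup edit from the log to a scratch config.toml.
cfg=/tmp/config.toml.demo
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF
# Same expression as the log; \1 re-emits the leading indentation.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
cat "$cfg"
```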
	I0831 15:38:14.165662    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:38:14.173102    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:38:14.180728    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:14.276483    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:38:14.296705    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:38:14.296785    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:38:14.312751    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:38:14.325397    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:38:14.342774    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:38:14.353024    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:38:14.363251    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:38:14.380028    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:38:14.390424    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
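The two `crictl.yaml` writes in this log point `crictl` first at containerd's socket and then, once Docker is selected as the runtime, at cri-dockerd's. The `printf | tee` pattern can be reproduced directly; the destination below is a scratch path standing in for `/etc/crictl.yaml`:

```shell
#!/bin/sh
# Reproduce the crictl.yaml write from the log against a scratch path.
dest=/tmp/crictl.yaml   # the log writes /etc/crictl.yaml via sudo tee
printf '%s\n' 'runtime-endpoint: unix:///var/run/cri-dockerd.sock' | tee "$dest"
```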
	I0831 15:38:14.405244    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:38:14.408231    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:38:14.415934    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:38:14.429648    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:38:14.529094    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:38:14.646662    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:38:14.646690    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:38:14.660870    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:14.760474    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:38:17.038586    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.278065529s)
	I0831 15:38:17.038650    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:38:17.049008    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:38:17.062620    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:38:17.073607    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:38:17.168850    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:38:17.269764    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:17.377489    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:38:17.390666    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:38:17.402072    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:17.507294    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:38:17.568987    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:38:17.569066    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:38:17.574853    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:38:17.574909    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:38:17.578814    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:38:17.605368    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:38:17.605446    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:38:17.624343    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:38:17.679051    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:38:17.753456    3744 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:38:17.812386    3744 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:38:17.902651    3744 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	I0831 15:38:17.924439    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:17.924700    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:38:17.928251    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
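The `/etc/hosts` update above is an idempotent remove-then-append: `grep -v` filters out any existing `host.minikube.internal` line before the fresh mapping is echoed on, so repeated runs leave exactly one entry. The same idiom against a scratch hosts file (IPs and paths here are illustrative):

```shell
#!/bin/bash
# Idempotent hosts-entry update, as in the log, on a scratch file.
hosts=/tmp/hosts.demo
printf '127.0.0.1\tlocalhost\n192.169.0.9\thost.minikube.internal\n' > "$hosts"

update() {
    # Drop any line ending in "<tab>host.minikube.internal", then append ours.
    { grep -v $'\thost.minikube.internal$' "$hosts"; \
      printf '192.169.0.1\thost.minikube.internal\n'; } > /tmp/h.$$
    cp /tmp/h.$$ "$hosts" && rm /tmp/h.$$
}
update
update   # running twice still yields a single entry
cat "$hosts"
```

The temp-file-then-`cp` step mirrors the log's `sudo cp`, which preserves the ownership and SELinux labels of the existing `/etc/hosts` rather than replacing the inode.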
	I0831 15:38:17.938446    3744 mustload.go:65] Loading cluster: ha-949000
	I0831 15:38:17.938620    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:17.938850    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:38:17.938873    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:38:17.947622    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51903
	I0831 15:38:17.948032    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:38:17.948446    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:38:17.948460    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:38:17.948674    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:38:17.948791    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:38:17.948881    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:17.948987    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:38:17.950000    3744 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:38:17.950260    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:38:17.950294    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:38:17.959428    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51905
	I0831 15:38:17.959777    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:38:17.960145    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:38:17.960162    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:38:17.960360    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:38:17.960471    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:38:17.960562    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.8
	I0831 15:38:17.960568    3744 certs.go:194] generating shared ca certs ...
	I0831 15:38:17.960576    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:38:17.960771    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:38:17.960844    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:38:17.960854    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:38:17.960878    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:38:17.960897    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:38:17.960914    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:38:17.961001    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:38:17.961051    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:38:17.961060    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:38:17.961098    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:38:17.961130    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:38:17.961166    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:38:17.961235    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:38:17.961269    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:38:17.961290    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:17.961312    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:38:17.961342    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:38:17.980971    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:38:18.000269    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:38:18.019936    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:38:18.039774    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:38:18.059357    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:38:18.078502    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:38:18.097967    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:38:18.102444    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:38:18.111969    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:38:18.115584    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:38:18.115639    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:38:18.119889    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:38:18.129130    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:38:18.138067    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:18.141420    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:18.141464    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:18.145592    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:38:18.154725    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:38:18.163859    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:38:18.167695    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:38:18.167749    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:38:18.172178    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
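The certificate steps above build OpenSSL's hashed-directory lookup: each CA file gets a symlink named `<subject-hash>.0` (e.g. `b5213941.0`) so verification can find it by hash, with the hash taken from `openssl x509 -hash -noout`. A sketch with a throwaway self-signed certificate; all paths and the CN are hypothetical:

```shell
#!/bin/bash
# Build an OpenSSL hashed-cert-dir entry, as minikube does for its CAs.
dir=/tmp/certs.demo
mkdir -p "$dir"
# Throwaway self-signed cert just to have something to hash.
openssl req -x509 -newkey rsa:2048 -nodes -subj '/CN=demoCA' \
    -keyout "$dir/demo.key" -out "$dir/demo.pem" -days 1 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$dir/demo.pem")
ln -fs "$dir/demo.pem" "$dir/$hash.0"   # analogous to /etc/ssl/certs/b5213941.0
ls -l "$dir/$hash.0"
```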
	I0831 15:38:18.181412    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:38:18.184441    3744 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:38:18.184478    3744 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.31.0  false true} ...
	I0831 15:38:18.184543    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:38:18.184588    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:38:18.192672    3744 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:38:18.192722    3744 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:38:18.201203    3744 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256
	I0831 15:38:18.201203    3744 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256
	I0831 15:38:18.201205    3744 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256
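The three `binary.go` lines above stream kubeadm, kubectl, and kubelet from dl.k8s.io with a `?checksum=file:` query, meaning each download is verified against the published `.sha256` sidecar file rather than served from cache. The verify-against-sidecar step can be sketched locally with no network; the file names below are made up for illustration:

```shell
#!/bin/bash
# Verify a "binary" against its sidecar .sha256, like the checksum=file: URLs above.
bin=/tmp/kubefoo
printf 'fake-binary-contents\n' > "$bin"
sha256sum "$bin" | awk '{print $1}' > "$bin.sha256"   # publisher side

# Consumer side: recompute and compare before trusting the file.
want=$(cat "$bin.sha256")
got=$(sha256sum "$bin" | awk '{print $1}')
[ "$want" = "$got" ] && echo "checksum ok"
```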
	I0831 15:38:18.201219    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:38:18.201219    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:38:18.201260    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:38:18.201327    3744 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:38:18.201327    3744 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:38:18.213304    3744 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:38:18.213305    3744 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:38:18.213306    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:38:18.213339    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:38:18.213339    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:38:18.213434    3744 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:38:18.234959    3744 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:38:18.235000    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
	I0831 15:38:18.870025    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0831 15:38:18.878175    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:38:18.892204    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:38:18.906289    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:38:18.909279    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:38:18.919652    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:19.014285    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:38:19.030068    3744 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}
	I0831 15:38:19.030257    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:19.052807    3744 out.go:177] * Verifying Kubernetes components...
	I0831 15:38:19.073469    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:19.170855    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:38:19.775316    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:38:19.775538    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:38:19.775580    3744 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:38:19.775737    3744 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:38:19.775777    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:19.775783    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:19.775789    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:19.775793    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:19.778097    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:20.276562    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:20.276584    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:20.276613    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:20.276621    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:20.279146    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:20.777079    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:20.777090    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:20.777097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:20.777101    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:20.779128    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:21.277261    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:21.277277    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:21.277283    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:21.277286    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:21.279452    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:21.776272    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:21.776285    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:21.776292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:21.776295    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:21.778482    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:21.778547    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:22.276209    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:22.276224    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:22.276233    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:22.276239    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:22.278431    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:22.775916    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:22.775932    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:22.775939    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:22.775943    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:22.778178    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:23.276360    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:23.276381    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:23.276392    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:23.276398    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:23.279406    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:23.775977    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:23.775995    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:23.776032    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:23.776037    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:23.778193    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:24.277072    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:24.277087    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:24.277093    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:24.277097    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:24.279300    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:24.279384    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:24.777071    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:24.777083    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:24.777089    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:24.777093    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:24.779084    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:25.277302    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:25.277326    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:25.277343    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:25.277370    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:25.280739    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:25.777360    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:25.777375    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:25.777382    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:25.777386    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:25.779584    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:26.277703    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:26.277720    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:26.277728    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:26.277739    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:26.279789    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:26.279858    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:26.776231    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:26.776272    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:26.776280    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:26.776285    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:26.778315    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:27.276174    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:27.276188    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:27.276194    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:27.276197    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:27.278437    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:27.776689    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:27.776708    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:27.776717    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:27.776721    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:27.779053    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:28.276081    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:28.276100    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:28.276111    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:28.276117    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:28.279235    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:28.776709    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:28.776722    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:28.776728    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:28.776732    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:28.778876    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:28.778948    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:29.276276    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:29.276292    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:29.276300    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:29.276306    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:29.278917    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:29.776120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:29.776137    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:29.776147    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:29.776153    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:29.778926    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:30.277099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:30.277114    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:30.277119    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:30.277121    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:30.279209    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:30.776289    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:30.776306    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:30.776318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:30.776325    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:30.778950    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:30.779042    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:31.277113    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:31.277129    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:31.277137    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:31.277142    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:31.279308    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:31.776871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:31.776885    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:31.776892    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:31.776907    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:31.779110    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:32.276639    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:32.276666    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:32.276677    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:32.276709    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:32.279642    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:32.776964    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:32.777005    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:32.777013    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:32.777017    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:32.778916    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:33.276097    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:33.276113    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:33.276120    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:33.276123    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:33.278201    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:33.278323    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:33.778025    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:33.778051    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:33.778062    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:33.778067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:33.781122    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:34.277596    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:34.277611    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:34.277617    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:34.277620    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:34.279507    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:34.776042    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:34.776055    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:34.776061    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:34.776064    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:34.778134    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:35.276180    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:35.276203    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:35.276281    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:35.276292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:35.279248    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:35.279324    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:35.776557    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:35.776577    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:35.776588    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:35.776595    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:35.779906    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:36.276525    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:36.276541    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:36.276547    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:36.276550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:36.278734    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:36.776417    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:36.776489    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:36.776512    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:36.776522    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:36.779524    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:37.277720    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:37.277733    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:37.277739    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:37.277743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:37.279925    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:37.279984    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:37.777252    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:37.777269    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:37.777274    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:37.777277    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:37.779271    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:38.278156    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:38.278211    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:38.278223    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:38.278229    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:38.280712    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:38.776178    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:38.776203    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:38.776213    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:38.776219    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:38.779093    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:39.276872    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:39.276885    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:39.276892    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:39.276896    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:39.279063    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:39.776884    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:39.776898    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:39.776905    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:39.776909    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:39.779259    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:39.779362    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:40.277202    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:40.277230    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:40.277242    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:40.277249    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:40.280416    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:40.776384    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:40.776396    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:40.776403    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:40.776406    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:40.778591    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:41.276444    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:41.276465    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:41.276477    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:41.276482    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:41.279236    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:41.777547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:41.777633    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:41.777648    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:41.777658    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:41.780834    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:41.780914    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:42.276626    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:42.276639    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:42.276646    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:42.276649    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:42.278771    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:42.777502    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:42.777527    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:42.777539    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:42.777544    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:42.780668    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:43.277508    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:43.277532    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:43.277544    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:43.277551    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:43.281198    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:43.777290    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:43.777306    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:43.777313    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:43.777316    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:43.779556    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:44.277098    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:44.277133    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:44.277144    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:44.277150    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:44.280482    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:44.280570    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:44.776182    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:44.776196    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:44.776204    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:44.776210    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:44.778630    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:45.276509    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:45.276522    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:45.276528    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:45.276540    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:45.278778    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:45.776791    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:45.776866    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:45.776879    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:45.776888    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:45.779812    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:46.277629    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:46.277650    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:46.277661    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:46.277669    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:46.280694    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:46.280771    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:46.776617    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:46.776632    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:46.776639    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:46.776644    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:46.778705    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:47.276853    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:47.276872    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:47.276881    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:47.276886    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:47.279224    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:47.777691    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:47.777716    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:47.777764    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:47.777772    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:47.780764    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:48.276263    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:48.276280    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:48.276286    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:48.276289    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:48.278387    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:48.776798    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:48.776866    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:48.776876    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:48.776880    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:48.779266    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:48.779328    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:49.277706    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:49.277731    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:49.277798    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:49.277809    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:49.280441    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:49.776295    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:49.776306    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:49.776312    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:49.776320    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:49.778554    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:50.278315    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:50.278338    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:50.278403    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:50.278414    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:50.281533    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:50.777763    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:50.777778    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:50.777787    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:50.777796    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:50.780173    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:50.780239    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:51.276316    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:51.276332    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:51.276338    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:51.276342    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:51.278631    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:51.776296    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:51.776316    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:51.776325    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:51.776330    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:51.778726    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:52.276790    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:52.276847    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:52.276864    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:52.276870    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:52.279948    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:52.777099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:52.777115    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:52.777121    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:52.777126    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:52.779325    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:53.276819    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:53.276881    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:53.276895    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:53.276904    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:53.279807    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:53.279883    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:53.776517    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:53.776532    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:53.776539    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:53.776543    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:53.778686    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:54.276276    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:54.276289    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:54.276299    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:54.276302    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:54.278627    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:54.777871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:54.777890    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:54.777900    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:54.777906    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:54.781132    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:55.276882    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:55.276901    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:55.276913    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:55.276919    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:55.280226    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:55.280299    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:55.777001    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:55.777014    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:55.777020    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:55.777023    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:55.779025    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:56.277691    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:56.277714    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:56.277726    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:56.277731    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:56.280819    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:56.778188    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:56.778247    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:56.778257    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:56.778262    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:56.780685    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:57.276330    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:57.276344    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:57.276350    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:57.276354    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:57.278527    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:57.776849    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:57.776867    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:57.776875    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:57.776880    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:57.779132    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:57.779222    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:58.276676    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:58.276715    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:58.276723    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:58.276727    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:58.278722    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:58.776823    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:58.776841    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:58.776847    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:58.776851    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:58.779004    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:59.277009    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:59.277030    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:59.277041    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:59.277049    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:59.280147    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:59.776972    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:59.776990    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:59.776999    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:59.777007    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:59.780392    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:59.780554    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:00.278237    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:00.278268    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:00.278275    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:00.278279    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:00.280339    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:00.776782    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:00.776803    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:00.776814    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:00.776819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:00.780040    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:01.276687    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:01.276709    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:01.276717    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:01.276722    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:01.279213    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:01.776982    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:01.776997    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:01.777004    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:01.777008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:01.779255    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:02.278179    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:02.278239    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:02.278253    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:02.278261    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:02.281537    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:02.281611    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:02.776749    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:02.776775    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:02.776786    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:02.776793    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:02.780000    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:03.278084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:03.278100    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:03.278108    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:03.278112    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:03.280525    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:03.776918    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:03.776956    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:03.777002    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:03.777009    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:03.780638    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:04.278461    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:04.278485    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:04.278497    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:04.278502    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:04.281718    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:04.281791    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:04.776376    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:04.776389    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:04.776395    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:04.776398    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:04.778509    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:05.276848    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:05.276872    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:05.276883    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:05.276889    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:05.279765    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:05.776397    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:05.776423    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:05.776433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:05.776439    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:05.779793    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:06.277954    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:06.277969    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:06.277977    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:06.277981    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:06.280446    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:06.777008    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:06.777039    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:06.777100    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:06.777109    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:06.780058    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:06.780166    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:07.277181    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:07.277203    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:07.277217    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:07.277222    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:07.280356    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:07.777895    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:07.777940    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:07.777952    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:07.777957    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:07.780087    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:08.276718    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:08.276745    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:08.276757    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:08.276763    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:08.279711    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:08.777099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:08.777121    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:08.777132    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:08.777137    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:08.780212    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:08.780293    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:09.277158    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:09.277177    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:09.277183    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:09.277188    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:09.279328    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:09.776613    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:09.776624    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:09.776630    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:09.776635    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:09.778784    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:10.276643    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:10.276662    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:10.276674    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:10.276682    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:10.279706    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:10.776484    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:10.776495    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:10.776501    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:10.776504    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:10.778860    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:11.276917    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:11.276968    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:11.276981    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:11.276990    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:11.280015    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:11.280097    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:11.777726    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:11.777745    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:11.777753    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:11.777758    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:11.780176    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:12.278046    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:12.278058    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:12.278063    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:12.278067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:12.280005    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:12.777919    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:12.777945    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:12.777992    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:12.777997    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:12.780507    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:13.278486    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:13.278543    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:13.278554    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:13.278559    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:13.281627    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:13.281745    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:13.776833    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:13.776854    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:13.776862    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:13.776866    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:13.779535    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:14.276922    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:14.276946    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:14.276958    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:14.276966    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:14.280174    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:14.776595    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:14.776617    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:14.776629    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:14.776634    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:14.779819    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:15.278222    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:15.278239    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:15.278247    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:15.278251    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:15.280553    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:15.776940    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:15.776965    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:15.776977    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:15.776983    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:15.780495    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:15.780576    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:16.276617    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:16.276642    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:16.276652    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:16.276656    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:16.279277    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:16.776588    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:16.776609    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:16.776618    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:16.776622    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:16.778820    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:17.277548    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:17.277568    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:17.277581    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:17.277590    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:17.280937    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:17.776562    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:17.776588    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:17.776600    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:17.776607    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:17.780085    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:18.277015    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:18.277036    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:18.277048    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:18.277056    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:18.280239    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:18.280318    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:18.777930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:18.777950    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:18.777961    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:18.777968    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:18.781102    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:19.278645    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:19.278671    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:19.278683    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:19.278689    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:19.282270    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:19.778214    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:19.778225    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:19.778230    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:19.778234    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:19.780140    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:20.277051    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:20.277079    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:20.277090    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:20.277098    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:20.280382    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:20.280543    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:20.776682    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:20.776706    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:20.776719    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:20.776724    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:20.780231    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:21.278070    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:21.278085    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:21.278092    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:21.278096    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:21.280488    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:21.776700    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:21.776723    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:21.776735    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:21.776743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:21.779589    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:22.276945    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:22.276985    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:22.276996    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:22.277001    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:22.279378    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:22.777198    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:22.777217    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:22.777226    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:22.777230    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:22.779837    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:22.779899    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:23.277517    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:23.277532    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:23.277540    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:23.277546    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:23.280021    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:23.776652    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:23.776672    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:23.776680    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:23.776685    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:23.779129    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:24.277535    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:24.277618    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:24.277631    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:24.277637    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:24.280844    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:24.776736    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:24.776755    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:24.776767    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:24.776774    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:24.779817    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:25.277529    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:25.277549    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:25.277560    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:25.277564    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:25.280343    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:25.280414    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:25.777390    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:25.777407    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:25.777415    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:25.777419    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:25.779809    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:26.277450    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:26.277472    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:26.277485    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:26.277492    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:26.279869    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:26.776900    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:26.776921    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:26.776929    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:26.776934    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:26.779045    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:27.277440    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:27.277457    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:27.277463    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:27.277467    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:27.279629    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:27.776631    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:27.776647    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:27.776655    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:27.776660    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:27.779236    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:27.779329    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:28.276659    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:28.276685    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:28.276697    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:28.276704    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:28.279990    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:28.777285    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:28.777319    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:28.777326    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:28.777330    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:28.779470    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:29.276786    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:29.276806    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:29.276818    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:29.276824    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:29.279639    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:29.777308    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:29.777319    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:29.777325    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:29.777328    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:29.779377    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:29.779445    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:30.277508    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:30.277524    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:30.277530    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:30.277535    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:30.279611    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:30.778698    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:30.778722    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:30.778737    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:30.778746    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:30.781722    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:31.276851    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:31.276867    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:31.276876    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:31.276888    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:31.279490    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:31.778105    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:31.778123    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:31.778133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:31.778137    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:31.780442    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:31.780510    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:32.278437    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:32.278459    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:32.278471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:32.278476    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:32.281165    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:32.778521    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:32.778597    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:32.778610    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:32.778618    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:32.781925    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:33.276802    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:33.276823    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:33.276832    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:33.276837    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:33.279437    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:33.777585    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:33.777608    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:33.777620    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:33.777629    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:33.780773    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:33.780858    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:34.277701    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:34.277717    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:34.277723    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:34.277726    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:34.279795    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:34.777419    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:34.777432    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:34.777438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:34.777442    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:34.779621    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:35.276815    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:35.276837    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:35.276847    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:35.276852    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:35.279717    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:35.778287    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:35.778312    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:35.778399    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:35.778409    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:35.781136    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:35.781210    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:36.276900    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:36.276915    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:36.276922    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:36.276925    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:36.279177    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:36.777030    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:36.777056    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:36.777068    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:36.777075    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:36.780399    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:37.276789    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:37.276805    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:37.276814    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:37.276819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:37.279300    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:37.777098    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:37.777112    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:37.777117    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:37.777121    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:37.779283    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:38.277802    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:38.277865    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:38.277876    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:38.277884    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:38.279839    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:38.279898    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:38.777985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:38.778008    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:38.778021    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:38.778027    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:38.781190    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:39.278167    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:39.278215    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:39.278222    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:39.278227    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:39.280014    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:39.778411    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:39.778425    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:39.778433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:39.778437    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:39.780400    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:40.276768    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:40.276779    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:40.276785    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:40.276789    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:40.278622    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:40.776752    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:40.776766    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:40.776792    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:40.776795    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:40.779016    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:40.779098    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:41.278166    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:41.278185    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:41.278202    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:41.278206    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:41.280453    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:41.776879    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:41.776941    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:41.776950    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:41.776956    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:41.779462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:42.277893    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:42.277906    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:42.277912    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:42.277916    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:42.279774    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:42.776804    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:42.776825    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:42.776836    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:42.776841    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:42.780314    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:42.780388    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:43.277438    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:43.277453    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:43.277461    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:43.277466    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:43.279958    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:43.776777    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:43.776790    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:43.776796    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:43.776799    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:43.778854    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:44.277120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:44.277141    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:44.277152    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:44.277167    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:44.280063    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:44.777870    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:44.777891    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:44.777902    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:44.777910    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:44.780670    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:44.780806    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:45.278440    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:45.278453    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:45.278459    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:45.278464    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:45.280687    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:45.776997    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:45.777022    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:45.777033    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:45.777045    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:45.779681    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:46.277720    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:46.277761    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:46.277771    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:46.277777    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:46.279827    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:46.777445    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:46.777460    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:46.777466    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:46.777469    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:46.779643    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:47.278055    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:47.278120    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:47.278134    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:47.278141    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:47.281004    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:47.281121    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:47.778899    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:47.778923    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:47.778933    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:47.779001    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:47.781920    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:48.278094    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:48.278140    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:48.278148    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:48.278153    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:48.280253    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:48.776917    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:48.776935    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:48.776947    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:48.776954    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:48.779870    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:49.277147    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:49.277168    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:49.277179    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:49.277186    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:49.279804    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:49.778489    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:49.778501    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:49.778508    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:49.778510    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:49.780670    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:49.780731    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:50.278221    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:50.278248    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:50.278302    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:50.278312    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:50.281268    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:50.777610    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:50.777650    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:50.777663    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:50.777672    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:50.780328    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:51.276863    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:51.276878    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:51.276884    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:51.276887    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:51.278829    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:51.778792    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:51.778815    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:51.778829    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:51.778836    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:51.782105    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:51.782175    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:52.277513    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:52.277532    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:52.277544    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:52.277550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:52.280450    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:52.778374    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:52.778390    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:52.778396    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:52.778399    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:52.780416    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:53.277640    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:53.277659    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:53.277671    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:53.277677    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:53.280752    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:53.778971    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:53.779023    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:53.779036    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:53.779044    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:53.782509    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:53.782591    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:54.277469    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:54.277484    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:54.277491    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:54.277495    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:54.279585    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:54.778653    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:54.778675    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:54.778688    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:54.778708    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:54.781762    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:55.277208    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:55.277222    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:55.277263    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:55.277270    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:55.279152    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:55.777066    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:55.777102    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:55.777110    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:55.777115    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:55.779288    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:56.278230    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:56.278241    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:56.278248    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:56.278251    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:56.280389    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:56.280448    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:56.778057    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:56.778137    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:56.778151    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:56.778158    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:56.781449    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:57.277127    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:57.277141    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:57.277148    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:57.277151    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:57.279114    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:57.778467    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:57.778478    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:57.778485    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:57.778487    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:57.780611    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:58.277035    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:58.277048    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:58.277064    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:58.277069    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:58.284343    3744 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
	I0831 15:39:58.284416    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:58.778691    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:58.778707    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:58.778714    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:58.778718    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:58.780664    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:59.277786    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:59.277801    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:59.277810    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:59.277815    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:59.280162    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:59.777363    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:59.777389    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:59.777400    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:59.777417    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:59.780437    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:00.278216    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:00.278231    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:00.278238    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:00.278241    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:00.280398    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:00.777947    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:00.777973    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:00.777985    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:00.777992    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:00.780895    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:00.780963    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:01.277061    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:01.277081    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:01.277097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:01.277105    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:01.280071    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:01.778574    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:01.778590    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:01.778596    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:01.778599    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:01.780602    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:02.277039    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:02.277051    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:02.277057    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:02.277060    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:02.279367    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:02.777088    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:02.777113    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:02.777124    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:02.777130    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:02.780010    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:03.277129    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:03.277143    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:03.277150    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:03.277155    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:03.279360    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:03.279419    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:03.779084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:03.779111    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:03.779120    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:03.779131    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:03.782578    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:04.279084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:04.279114    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:04.279193    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:04.279209    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:04.282418    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:04.777281    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:04.777294    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:04.777300    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:04.777304    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:04.779400    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:05.277304    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:05.277357    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:05.277370    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:05.277378    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:05.280048    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:05.280120    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:05.777871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:05.777897    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:05.777908    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:05.777914    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:05.781308    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:06.278341    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:06.278357    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:06.278365    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:06.278369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:06.280721    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:06.777260    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:06.777278    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:06.777313    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:06.777319    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:06.779441    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:07.277289    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:07.277354    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:07.277368    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:07.277376    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:07.280642    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:07.280705    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:07.777567    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:07.777583    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:07.777589    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:07.777591    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:07.779882    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:08.277957    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:08.277972    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:08.277980    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:08.277985    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:08.280280    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:08.777495    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:08.777523    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:08.777535    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:08.777541    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:08.780074    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:09.278397    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:09.278413    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:09.278419    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:09.278422    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:09.280703    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:09.280789    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:09.778365    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:09.778379    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:09.778388    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:09.778392    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:09.780762    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:10.277879    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:10.277891    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:10.277897    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:10.277900    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:10.279957    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:10.777727    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:10.777743    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:10.777749    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:10.777752    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:10.779982    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:11.277869    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:11.277892    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:11.277908    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:11.277916    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:11.281007    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:11.281122    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:11.777300    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:11.777325    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:11.777375    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:11.777385    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:11.780070    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:12.277419    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:12.277438    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:12.277444    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:12.277450    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:12.279959    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:12.778536    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:12.778559    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:12.778570    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:12.778577    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:12.782121    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:13.277460    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:13.277540    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:13.277558    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:13.277567    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:13.280462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:13.777386    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:13.777406    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:13.777417    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:13.777423    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:13.779721    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:13.779795    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:14.278352    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:14.278373    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:14.278382    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:14.278386    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:14.280995    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:14.777911    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:14.777931    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:14.777944    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:14.777953    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:14.780609    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:15.277390    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:15.277406    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:15.277413    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:15.277418    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:15.279552    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:15.777171    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:15.777196    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:15.777208    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:15.777213    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:15.780616    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:15.780690    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:16.278393    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:16.278413    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:16.278423    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:16.278431    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:16.281087    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:16.777493    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:16.777505    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:16.777511    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:16.777514    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:16.779511    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:17.277948    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:17.277963    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:17.277971    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:17.277975    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:17.281263    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:17.777610    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:17.777635    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:17.777645    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:17.777652    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:17.780711    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:17.780810    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:18.278407    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:18.278427    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:18.278438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:18.278445    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:18.280714    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:18.778225    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:18.778250    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:18.778258    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:18.778263    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:18.781566    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:19.278245    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:19.278271    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:19.278341    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:19.278351    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:19.281708    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:19.778206    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:19.778220    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:19.778226    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:19.778231    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:19.780309    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:20.277705    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:20.277724    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:20.277735    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:20.277743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:20.280797    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:20.280880    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:20.777518    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:20.777542    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:20.777554    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:20.777562    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:20.780637    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:21.277649    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:21.277665    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:21.277671    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:21.277675    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:21.280074    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:21.778048    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:21.778072    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:21.778084    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:21.778090    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:21.781448    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:22.277500    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:22.277519    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:22.277530    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:22.277535    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:22.280641    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:22.778428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:22.778446    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:22.778455    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:22.778461    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:22.780933    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:22.780991    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:23.277541    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:23.277605    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:23.277620    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:23.277627    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:23.280957    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:23.777433    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:23.777447    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:23.777454    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:23.777457    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:23.779506    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:24.277362    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:24.277385    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:24.277433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:24.277440    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:24.280068    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:24.778081    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:24.778099    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:24.778111    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:24.778117    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:24.781146    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:24.781249    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:25.278144    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:25.278167    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:25.278178    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:25.278185    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:25.281087    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:25.778478    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:25.778499    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:25.778540    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:25.778545    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:25.780863    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:26.277292    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:26.277320    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:26.277335    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:26.277342    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:26.280115    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:26.777557    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:26.777573    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:26.777581    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:26.777585    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:26.779974    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:27.278458    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:27.278474    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:27.278481    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:27.278484    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:27.280521    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:27.280595    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:27.777967    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:27.777987    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:27.777996    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:27.778001    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:27.780485    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:28.277807    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:28.277826    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:28.277838    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:28.277846    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:28.280427    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:28.777498    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:28.777510    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:28.777516    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:28.777520    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:28.779847    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:29.277964    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:29.277985    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:29.277996    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:29.278002    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:29.280815    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:29.280906    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:29.778537    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:29.778559    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:29.778570    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:29.778575    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:29.781623    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:30.277396    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:30.277412    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:30.277420    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:30.277424    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:30.279862    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:30.778701    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:30.778785    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:30.778800    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:30.778808    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:30.781829    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:31.278707    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:31.278727    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:31.278738    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:31.278744    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:31.281658    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:31.281726    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:31.778169    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:31.778189    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:31.778199    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:31.778205    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:31.781541    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:32.277415    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:32.277446    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:32.277488    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:32.277498    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:32.280928    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:32.777636    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:32.777722    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:32.777736    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:32.777743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:32.780331    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:33.278774    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:33.278793    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:33.278802    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:33.278807    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:33.281266    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:33.778581    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:33.778604    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:33.778615    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:33.778622    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:33.781819    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:33.781931    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:34.278488    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:34.278512    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:34.278538    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:34.278546    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:34.281635    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:34.777686    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:34.777700    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:34.777708    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:34.777713    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:34.780113    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:35.277895    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:35.277919    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:35.277930    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:35.277935    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:35.281263    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:35.777425    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:35.777449    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:35.777467    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:35.777477    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:35.780717    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:36.279317    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:36.279363    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:36.279373    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:36.279381    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:36.282024    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:36.282088    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:36.777443    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:36.777459    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:36.777468    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:36.777473    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:36.779899    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:37.278285    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:37.278300    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:37.278306    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:37.278311    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:37.280691    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:37.778439    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:37.778466    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:37.778477    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:37.778484    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:37.781678    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:38.279008    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:38.279038    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:38.279051    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:38.279059    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:38.282603    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:38.282694    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:38.778818    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:38.778844    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:38.778855    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:38.778861    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:38.783197    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:40:39.278660    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:39.278672    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:39.278678    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:39.278681    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:39.280786    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:39.777503    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:39.777522    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:39.777535    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:39.777541    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:39.780544    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:40.278292    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:40.278317    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:40.278329    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:40.278337    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:40.281137    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:40.778006    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:40.778032    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:40.778057    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:40.778071    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:40.781057    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:40.781158    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:41.278405    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:41.278469    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:41.278519    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:41.278533    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:41.281715    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:41.777417    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:41.777432    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:41.777438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:41.777441    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:41.779462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:42.278930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:42.278962    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:42.278969    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:42.278974    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:42.280885    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:42.778654    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:42.778673    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:42.778708    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:42.778714    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:42.781210    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:42.781277    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:43.277428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:43.277444    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:43.277450    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:43.277454    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:43.279641    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:43.778230    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:43.778243    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:43.778248    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:43.778252    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:43.780641    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:44.278516    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:44.278536    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:44.278545    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:44.278550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:44.280826    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:44.777524    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:44.777543    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:44.777554    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:44.777560    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:44.780897    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:45.279411    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:45.279427    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:45.279433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:45.279436    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:45.281622    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:45.281684    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:45.779055    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:45.779071    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:45.779078    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:45.779081    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:45.780982    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:46.278769    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:46.278788    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:46.278794    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:46.278799    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:46.280873    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:46.779140    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:46.779158    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:46.779191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:46.779195    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:46.781223    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:47.277666    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:47.277689    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:47.277725    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:47.277732    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:47.280012    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:47.778363    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:47.778385    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:47.778394    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:47.778399    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:47.780853    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:47.780917    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:48.277895    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:48.277909    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:48.277915    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:48.277917    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:48.279760    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:48.778443    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:48.778469    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:48.778480    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:48.778487    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:48.782101    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:49.278898    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:49.278941    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:49.278948    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:49.278952    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:49.280953    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:49.779196    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:49.779209    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:49.779218    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:49.779222    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:49.781778    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:49.781836    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:50.277693    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:50.277708    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:50.277717    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:50.277721    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:50.279726    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:50.778035    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:50.778058    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:50.778070    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:50.778079    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:50.781019    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:51.277510    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:51.277549    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:51.277564    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:51.277567    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:51.279483    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:51.779060    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:51.779084    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:51.779113    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:51.779118    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:51.781564    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:52.278175    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:52.278187    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:52.278193    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:52.278197    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:52.280098    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:52.280167    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:52.778702    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:52.778717    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:52.778726    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:52.778730    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:52.781143    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:53.278862    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:53.278918    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:53.278925    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:53.278930    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:53.281004    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:53.779375    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:53.779401    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:53.779412    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:53.779418    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:53.783259    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:54.279308    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:54.279324    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:54.279331    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:54.279334    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:54.281428    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:54.281496    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:54.778158    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:54.778177    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:54.778191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:54.778198    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:54.781197    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:55.277649    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:55.277663    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:55.277668    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:55.277672    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:55.279760    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:55.779263    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:55.779320    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:55.779330    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:55.779335    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:55.781789    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:56.278112    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:56.278124    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:56.278129    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:56.278133    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:56.280134    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:56.777990    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:56.778010    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:56.778022    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:56.778032    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:56.781213    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:56.781297    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:57.279143    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:57.279158    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:57.279164    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:57.279168    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:57.280873    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:57.778857    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:57.778881    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:57.778893    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:57.778900    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:57.781501    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:58.278300    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:58.278312    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:58.278318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:58.278324    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:58.280252    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:58.779105    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:58.779139    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:58.779146    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:58.779150    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:58.780953    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:59.277772    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:59.277810    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:59.277817    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:59.277821    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:59.279828    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:59.279892    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:59.778306    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:59.778323    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:59.778334    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:59.778339    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:59.781562    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:00.279133    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:00.279147    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:00.279154    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:00.279157    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:00.280989    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:00.777674    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:00.777695    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:00.777706    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:00.777714    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:00.780625    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:01.278700    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:01.278712    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:01.278718    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:01.278722    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:01.280680    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:01.280741    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:01.778674    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:01.778695    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:01.778704    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:01.778709    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:01.781227    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:02.278744    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:02.278759    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:02.278764    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:02.278767    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:02.280603    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:02.778554    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:02.778581    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:02.778646    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:02.778654    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:02.781844    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:03.277956    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:03.277979    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:03.277986    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:03.277990    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:03.279854    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:03.777682    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:03.777698    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:03.777704    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:03.777707    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:03.780161    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:03.780218    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:04.278517    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:04.278529    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:04.278535    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:04.278537    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:04.280362    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:04.778969    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:04.778980    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:04.778986    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:04.778989    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:04.782704    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:05.278152    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:05.278165    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:05.278170    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:05.278173    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:05.280542    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:05.778218    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:05.778303    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:05.778320    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:05.778329    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:05.781501    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:05.781585    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:06.277766    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:06.277778    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:06.277784    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:06.277787    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:06.279880    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:06.778001    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:06.778021    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:06.778033    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:06.778039    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:06.781121    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:07.278457    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:07.278468    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:07.278474    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:07.278478    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:07.280352    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:07.778000    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:07.778020    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:07.778031    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:07.778037    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:07.781054    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:08.277960    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:08.277972    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:08.277978    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:08.277982    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:08.279721    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:08.279790    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:08.777988    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:08.778006    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:08.778014    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:08.778019    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:08.780429    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:09.278866    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:09.278887    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:09.278894    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:09.278898    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:09.280928    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:09.777942    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:09.777961    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:09.777972    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:09.777978    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:09.781177    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:10.279287    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:10.279340    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:10.279352    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:10.279358    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:10.281250    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:10.281309    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:10.779851    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:10.779871    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:10.779883    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:10.779888    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:10.783174    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:11.279199    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:11.279213    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:11.279219    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:11.279223    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:11.281075    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:11.778332    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:11.778350    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:11.778359    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:11.778363    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:11.780504    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:12.279627    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:12.279653    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:12.279665    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:12.279671    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:12.282520    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:12.282615    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:12.778391    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:12.778412    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:12.778424    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:12.778428    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:12.781446    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:13.278111    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:13.278123    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:13.278129    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:13.278132    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:13.279887    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:13.779218    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:13.779233    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:13.779239    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:13.779242    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:13.781352    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:14.277806    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:14.277818    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:14.277823    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:14.277827    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:14.279913    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:14.779779    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:14.779797    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:14.779808    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:14.779814    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:14.783141    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:14.783269    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:15.278003    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:15.278017    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:15.278023    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:15.278027    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:15.279730    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:15.778699    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:15.778720    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:15.778731    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:15.778737    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:15.781939    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:16.278805    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:16.278818    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:16.278846    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:16.278851    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:16.280696    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:16.778278    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:16.778298    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:16.778307    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:16.778312    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:16.780692    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:17.278010    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:17.278061    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:17.278071    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:17.278075    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:17.280183    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:17.280244    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:17.779658    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:17.779684    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:17.779696    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:17.779703    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:17.782964    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:18.279131    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:18.279146    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:18.279152    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:18.279155    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:18.281127    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:18.778591    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:18.778613    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:18.778624    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:18.778631    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:18.781947    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:19.278144    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:19.278156    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:19.278162    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:19.278165    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:19.280314    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:19.280374    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:19.779309    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:19.779328    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:19.779339    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:19.779346    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:19.782226    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:20.278897    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:20.278909    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:20.278914    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:20.278917    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:20.280839    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:20.779038    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:20.779071    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:20.779085    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:20.779095    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:20.782073    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:21.278315    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:21.278364    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:21.278371    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:21.278376    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:21.280407    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:21.280468    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:21.778122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:21.778137    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:21.778144    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:21.778146    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:21.780207    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:22.278547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:22.278561    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:22.278568    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:22.278571    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:22.280976    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:22.778009    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:22.778029    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:22.778040    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:22.778045    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:22.780889    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:23.278954    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:23.278999    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:23.279008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:23.279011    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:23.283528    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:41:23.283590    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:23.779486    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:23.779512    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:23.779523    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:23.779536    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:23.782922    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:24.277863    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:24.277876    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:24.277882    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:24.277885    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:24.279860    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:24.779167    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:24.779185    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:24.779196    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:24.779202    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:24.782106    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:25.279100    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:25.279120    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:25.279131    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:25.279139    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:25.282042    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:25.778565    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:25.778640    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:25.778655    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:25.778663    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:25.781719    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:25.781792    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:26.279146    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:26.279182    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:26.279223    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:26.279229    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:26.282148    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:26.778592    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:26.778614    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:26.778626    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:26.778632    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:26.782054    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:27.278821    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:27.278835    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:27.278842    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:27.278845    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:27.281364    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:27.778073    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:27.778100    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:27.778118    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:27.778125    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:27.781324    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:28.277935    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:28.277959    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:28.278022    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:28.278033    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:28.281297    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:28.281465    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:28.778608    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:28.778635    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:28.778648    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:28.778656    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:28.781848    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:29.278110    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:29.278132    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:29.278143    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:29.278148    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:29.281146    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:29.778251    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:29.778265    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:29.778273    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:29.778277    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:29.782398    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:41:30.279687    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:30.279700    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:30.279708    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:30.279712    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:30.282090    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:30.282159    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:30.779599    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:30.779624    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:30.779636    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:30.779642    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:30.783210    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:31.279353    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:31.279366    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:31.279372    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:31.279376    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:31.281276    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:31.779671    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:31.779692    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:31.779704    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:31.779709    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:31.781611    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:32.279371    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:32.279395    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:32.279435    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:32.279442    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:32.282259    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:32.282329    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:32.779427    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:32.779446    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:32.779458    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:32.779463    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:32.782235    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:33.279452    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:33.279465    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:33.279471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:33.279474    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:33.281321    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:33.778052    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:33.778072    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:33.778083    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:33.778089    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:33.781878    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:34.278548    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:34.278567    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:34.278575    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:34.278584    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:34.281417    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:34.779174    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:34.779193    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:34.779205    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:34.779210    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:34.782050    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:34.782115    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:35.278139    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:35.278152    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:35.278158    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:35.278162    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:35.279993    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:35.779363    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:35.779450    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:35.779465    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:35.779473    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:35.782313    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:36.278357    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:36.278383    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:36.278394    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:36.278400    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:36.281375    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:36.778762    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:36.778799    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:36.778808    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:36.778814    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:36.780954    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:37.279993    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:37.280053    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:37.280067    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:37.280075    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:37.282945    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:37.283021    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:37.779739    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:37.779783    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:37.779790    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:37.779796    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:37.781629    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:38.279147    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:38.279171    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:38.279184    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:38.279190    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:38.281843    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:38.778741    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:38.778764    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:38.778814    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:38.778819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:38.781350    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:39.279360    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:39.279391    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:39.279399    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:39.279405    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:39.281151    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:39.778714    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:39.778733    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:39.778744    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:39.778752    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:39.781665    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:39.781800    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:40.278379    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:40.278393    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:40.278401    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:40.278405    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:40.280809    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:40.779343    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:40.779384    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:40.779392    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:40.779398    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:40.781388    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:41.279463    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:41.279490    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:41.279503    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:41.279508    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:41.282590    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:41.779242    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:41.779260    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:41.779267    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:41.779272    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:41.781369    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:42.279466    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:42.279483    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:42.279489    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:42.279492    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:42.281217    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:42.281311    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:42.778084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:42.778101    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:42.778109    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:42.778112    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:42.780674    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:43.279061    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:43.279078    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:43.279088    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:43.279093    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:43.281059    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:43.779095    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:43.779129    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:43.779136    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:43.779138    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:43.781068    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:44.279029    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:44.279048    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:44.279058    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:44.279063    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:44.281431    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:44.281488    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:44.779540    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:44.779553    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:44.779562    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:44.779566    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:44.782120    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:45.278415    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:45.278429    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:45.278440    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:45.278444    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:45.280960    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:45.778255    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:45.778298    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:45.778305    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:45.778309    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:45.780347    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:46.279010    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:46.279030    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:46.279041    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:46.279046    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:46.282148    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:46.282239    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:46.779747    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:46.779768    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:46.779776    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:46.779782    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:46.782151    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:47.278274    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:47.278298    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:47.278339    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:47.278345    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:47.280731    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:47.778365    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:47.778390    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:47.778399    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:47.778408    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:47.781184    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:48.279756    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:48.279775    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:48.279785    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:48.279789    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:48.282380    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:48.282440    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:48.780165    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:48.780186    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:48.780197    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:48.780205    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:48.783195    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:49.278649    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:49.278669    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:49.278680    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:49.278685    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:49.281793    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:49.780041    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:49.780056    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:49.780064    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:49.780069    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:49.782464    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:50.278528    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:50.278538    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:50.278545    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:50.278549    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:50.280284    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:50.778556    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:50.778582    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:50.778591    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:50.778596    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:50.781794    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:50.781879    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:51.278412    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:51.278448    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:51.278456    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:51.278459    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:51.280359    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:51.778770    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:51.778852    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:51.778867    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:51.778876    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:51.781710    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:52.279069    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:52.279089    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:52.279101    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:52.279107    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:52.282688    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:52.778612    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:52.778627    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:52.778634    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:52.778636    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:52.780790    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:53.278839    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:53.278918    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:53.278932    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:53.278939    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:53.281791    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:53.281864    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:53.778930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:53.778944    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:53.778953    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:53.778998    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:53.780750    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:54.279141    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:54.279163    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:54.279202    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:54.279208    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:54.281212    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:54.780288    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:54.780307    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:54.780318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:54.780326    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:54.783446    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:55.278636    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:55.278655    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:55.278669    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:55.278675    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:55.281304    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:55.778487    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:55.778506    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:55.778513    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:55.778517    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:55.780794    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:55.780852    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:56.279529    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:56.279542    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:56.279548    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:56.279552    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:56.281403    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:56.779390    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:56.779415    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:56.779427    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:56.779435    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:56.782652    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:57.279730    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:57.279749    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:57.279775    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:57.279778    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:57.282199    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:57.778341    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:57.778353    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:57.778360    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:57.778364    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:57.780339    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:58.280180    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:58.280200    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:58.280208    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:58.280212    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:58.283270    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:58.283333    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:58.778922    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:58.778934    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:58.778941    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:58.778944    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:58.781165    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:59.278658    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:59.278670    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:59.278677    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:59.278680    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:59.280526    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:59.780251    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:59.780269    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:59.780278    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:59.780285    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:59.783254    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:00.278299    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:00.278311    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:00.278318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:00.278321    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:00.280462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:00.778333    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:00.778357    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:00.778417    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:00.778425    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:00.781396    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:00.781503    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:01.279261    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:01.279281    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:01.279292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:01.279299    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:01.282233    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:01.778447    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:01.778464    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:01.778472    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:01.778476    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:01.780643    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:02.278526    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:02.278545    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:02.278557    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:02.278563    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:02.281443    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:02.778669    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:02.778693    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:02.778704    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:02.778709    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:02.782028    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:02.782104    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:03.278662    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:03.278675    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:03.278681    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:03.278684    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:03.281034    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:03.779554    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:03.779600    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:03.779611    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:03.779619    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:03.782537    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:04.278499    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:04.278522    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:04.278534    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:04.278542    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:04.281683    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:04.779122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:04.779133    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:04.779140    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:04.779143    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:04.781151    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:42:05.279493    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:05.279515    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:05.279527    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:05.279535    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:05.283494    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:05.283569    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:05.779088    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:05.779167    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:05.779181    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:05.779187    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:05.782371    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:06.279314    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:06.279355    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:06.279363    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:06.279369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:06.281532    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:06.779431    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:06.779454    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:06.779465    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:06.779473    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:06.782521    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:07.279058    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:07.279070    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:07.279078    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:07.279083    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:07.281403    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:07.780066    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:07.780081    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:07.780088    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:07.780092    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:07.782413    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:07.782477    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:08.278582    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:08.278601    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:08.278612    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:08.278617    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:08.281655    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:08.779457    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:08.779482    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:08.779494    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:08.779500    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:08.782874    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:09.278624    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:09.278660    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:09.278667    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:09.278671    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:09.280685    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:09.780183    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:09.780196    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:09.780204    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:09.780208    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:09.782479    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:09.782566    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:10.279033    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:10.279051    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:10.279074    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:10.279077    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:10.281035    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:42:10.778903    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:10.778916    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:10.778923    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:10.778926    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:10.781070    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:11.279519    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:11.279545    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:11.279587    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:11.279593    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:11.281932    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:11.780008    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:11.780026    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:11.780035    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:11.780041    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:11.782405    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:12.279415    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:12.279432    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:12.279438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:12.279441    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:12.283584    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:42:12.283720    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:12.779135    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:12.779163    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:12.779182    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:12.779192    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:12.782385    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:13.279985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:13.280010    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:13.280074    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:13.280083    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:13.287032    3744 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0831 15:42:13.778812    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:13.778824    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:13.778832    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:13.778837    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:13.780875    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:14.278421    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:14.278446    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:14.278459    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:14.278468    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:14.281130    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:14.778988    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:14.779006    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:14.779017    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:14.779025    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:14.782314    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:14.782385    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:15.279457    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:15.279477    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:15.279486    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:15.279492    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:15.281789    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:15.779420    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:15.779447    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:15.779459    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:15.779465    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:15.782822    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:16.278493    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:16.278512    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:16.278521    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:16.278526    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:16.280744    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:16.779399    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:16.779415    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:16.779421    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:16.779424    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:16.781497    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:17.279997    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:17.280026    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:17.280038    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:17.280046    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:17.283600    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:17.283684    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:17.778578    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:17.778593    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:17.778641    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:17.778645    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:17.780800    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:18.279627    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:18.279643    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:18.279650    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:18.279653    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:18.281669    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:18.778603    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:18.778615    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:18.778621    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:18.778625    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:18.781667    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:19.279738    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:19.279765    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:19.279777    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:19.279785    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:19.282926    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:19.779767    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:19.779781    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:19.779788    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:19.779791    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:19.781778    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:42:19.781867    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:19.781882    3744 node_ready.go:38] duration metric: took 4m0.003563812s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:42:19.812481    3744 out.go:201] 
	W0831 15:42:19.833493    3744 out.go:270] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0831 15:42:19.833512    3744 out.go:270] * 
	W0831 15:42:19.834711    3744 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 15:42:19.917735    3744 out.go:201] 

                                                
                                                
** /stderr **
ha_test.go:469: failed to run minikube start. args "out/minikube-darwin-amd64 node list -p ha-949000 -v=7 --alsologtostderr" : exit status 80
ha_test.go:472: (dbg) Run:  out/minikube-darwin-amd64 node list -p ha-949000
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-949000 -n ha-949000
helpers_test.go:245: <<< TestMultiControlPlane/serial/RestartClusterKeepsNodes FAILED: start of post-mortem logs <<<
helpers_test.go:246: ======>  post-mortem[TestMultiControlPlane/serial/RestartClusterKeepsNodes]: minikube logs <======
helpers_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 logs -n 25
helpers_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 logs -n 25: (3.729064283s)
helpers_test.go:253: TestMultiControlPlane/serial/RestartClusterKeepsNodes logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-949000 -- get pods -o          | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- get pods -o          | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- get pods -o          | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-949000 -v=7                | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-949000 node stop m02 -v=7         | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:33 PDT | 31 Aug 24 15:33 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-949000 node start m02 -v=7        | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:34 PDT | 31 Aug 24 15:34 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-949000 -v=7               | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:35 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-949000 -v=7                    | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:35 PDT | 31 Aug 24 15:36 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-949000 --wait=true -v=7        | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:36 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-949000                    | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:36:09
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:36:09.764310    3744 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:36:09.764592    3744 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:36:09.764597    3744 out.go:358] Setting ErrFile to fd 2...
	I0831 15:36:09.764601    3744 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:36:09.764770    3744 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:36:09.766289    3744 out.go:352] Setting JSON to false
	I0831 15:36:09.790255    3744 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2140,"bootTime":1725141629,"procs":434,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:36:09.790362    3744 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:36:09.812967    3744 out.go:177] * [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:36:09.857017    3744 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:36:09.857063    3744 notify.go:220] Checking for updates...
	I0831 15:36:09.900714    3744 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:09.921979    3744 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:36:09.948841    3744 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:36:09.970509    3744 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:36:09.991512    3744 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:36:10.013794    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:10.013954    3744 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:36:10.014628    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:10.014709    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:10.024181    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51800
	I0831 15:36:10.024557    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:10.024973    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:10.024981    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:10.025208    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:10.025338    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:10.053425    3744 out.go:177] * Using the hyperkit driver based on existing profile
	I0831 15:36:10.095518    3744 start.go:297] selected driver: hyperkit
	I0831 15:36:10.095547    3744 start.go:901] validating driver "hyperkit" against &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:fals
e efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p20
00.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:36:10.095803    3744 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:36:10.095991    3744 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:36:10.096192    3744 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:36:10.105897    3744 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:36:10.111634    3744 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:10.111657    3744 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:36:10.114891    3744 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:36:10.114962    3744 cni.go:84] Creating CNI manager for ""
	I0831 15:36:10.114970    3744 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0831 15:36:10.115051    3744 start.go:340] cluster config:
	{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.16
9.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-t
iller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0
MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:36:10.115155    3744 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:36:10.157575    3744 out.go:177] * Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	I0831 15:36:10.178565    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:36:10.178634    3744 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:36:10.178661    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:36:10.178859    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:36:10.178882    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:36:10.179080    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:10.179968    3744 start.go:360] acquireMachinesLock for ha-949000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:36:10.180093    3744 start.go:364] duration metric: took 100.253µs to acquireMachinesLock for "ha-949000"
	I0831 15:36:10.180125    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:36:10.180144    3744 fix.go:54] fixHost starting: 
	I0831 15:36:10.180570    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:10.180626    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:10.189873    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51802
	I0831 15:36:10.190215    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:10.190587    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:10.190602    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:10.190832    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:10.190956    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:10.191047    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:36:10.191129    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:10.191205    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:36:10.192132    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 2887 missing from process table
	I0831 15:36:10.192166    3744 fix.go:112] recreateIfNeeded on ha-949000: state=Stopped err=<nil>
	I0831 15:36:10.192185    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	W0831 15:36:10.192270    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:36:10.235417    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000" ...
	I0831 15:36:10.258400    3744 main.go:141] libmachine: (ha-949000) Calling .Start
	I0831 15:36:10.258670    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:10.258717    3744 main.go:141] libmachine: (ha-949000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid
	I0831 15:36:10.260851    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 2887 missing from process table
	I0831 15:36:10.260866    3744 main.go:141] libmachine: (ha-949000) DBG | pid 2887 is in state "Stopped"
	I0831 15:36:10.260894    3744 main.go:141] libmachine: (ha-949000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid...
	I0831 15:36:10.261058    3744 main.go:141] libmachine: (ha-949000) DBG | Using UUID 98cab9ba-901d-49d1-9e6c-321a4533d56e
	I0831 15:36:10.370955    3744 main.go:141] libmachine: (ha-949000) DBG | Generated MAC ce:8:77:f7:42:5e
	I0831 15:36:10.370980    3744 main.go:141] libmachine: (ha-949000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:36:10.371093    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003a6900)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:10.371127    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003a6900)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:10.371175    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "98cab9ba-901d-49d1-9e6c-321a4533d56e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:36:10.371220    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 98cab9ba-901d-49d1-9e6c-321a4533d56e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:36:10.371232    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:36:10.372813    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Pid is 3756
	I0831 15:36:10.373286    3744 main.go:141] libmachine: (ha-949000) DBG | Attempt 0
	I0831 15:36:10.373298    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:10.373398    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:36:10.375146    3744 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:36:10.375210    3744 main.go:141] libmachine: (ha-949000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:36:10.375229    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ebe4}
	I0831 15:36:10.375249    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d4eb85}
	I0831 15:36:10.375272    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:36:10.375287    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:36:10.375330    3744 main.go:141] libmachine: (ha-949000) DBG | Found match: ce:8:77:f7:42:5e
	I0831 15:36:10.375341    3744 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:36:10.375350    3744 main.go:141] libmachine: (ha-949000) DBG | IP: 192.169.0.5
	I0831 15:36:10.376038    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:10.376245    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:10.376722    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:36:10.376735    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:10.376898    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:10.377023    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:10.377121    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:10.377226    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:10.377318    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:10.377457    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:10.377688    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:10.377699    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:36:10.380749    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:36:10.432938    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:36:10.433650    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:10.433669    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:10.433677    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:10.433685    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:10.813736    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:36:10.813750    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:36:10.928786    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:10.928808    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:10.928820    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:10.928840    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:10.929718    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:36:10.929729    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:36:16.483580    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:36:16.483594    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:36:16.483602    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:36:16.508100    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:36:21.446393    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:36:21.446406    3744 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:36:21.446553    3744 buildroot.go:166] provisioning hostname "ha-949000"
	I0831 15:36:21.446562    3744 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:36:21.446665    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.446786    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.446905    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.447025    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.447124    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.447308    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.447472    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.447480    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000 && echo "ha-949000" | sudo tee /etc/hostname
	I0831 15:36:21.524007    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000
	
	I0831 15:36:21.524025    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.524158    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.524268    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.524375    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.524479    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.524631    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.524781    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.524792    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:36:21.591782    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:36:21.591802    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:36:21.591822    3744 buildroot.go:174] setting up certificates
	I0831 15:36:21.591828    3744 provision.go:84] configureAuth start
	I0831 15:36:21.591834    3744 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:36:21.591970    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:21.592077    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.592183    3744 provision.go:143] copyHostCerts
	I0831 15:36:21.592217    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:21.592287    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:36:21.592295    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:21.592443    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:36:21.592667    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:21.592706    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:36:21.592710    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:21.592784    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:36:21.592937    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:21.592978    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:36:21.592983    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:21.593095    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:36:21.593248    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000 san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]
	I0831 15:36:21.710940    3744 provision.go:177] copyRemoteCerts
	I0831 15:36:21.710993    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:36:21.711008    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.711135    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.711246    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.711328    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.711434    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:21.747436    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:36:21.747514    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:36:21.767330    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:36:21.767390    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0831 15:36:21.787147    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:36:21.787210    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:36:21.806851    3744 provision.go:87] duration metric: took 215.008206ms to configureAuth
	I0831 15:36:21.806864    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:36:21.807028    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:21.807041    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:21.807176    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.807304    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.807387    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.807476    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.807574    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.807684    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.807812    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.807819    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:36:21.869123    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:36:21.869137    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:36:21.869215    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:36:21.869228    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.869368    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.869456    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.869553    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.869651    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.869776    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.869915    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.869959    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:36:21.941116    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:36:21.941136    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.941270    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.941365    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.941441    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.941529    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.941663    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.941807    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.941819    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:36:23.639328    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:36:23.639343    3744 machine.go:96] duration metric: took 13.26247014s to provisionDockerMachine
	I0831 15:36:23.639354    3744 start.go:293] postStartSetup for "ha-949000" (driver="hyperkit")
	I0831 15:36:23.639362    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:36:23.639372    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.639572    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:36:23.639587    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.639684    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.639792    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.639927    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.640026    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.679356    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:36:23.683676    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:36:23.683690    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:36:23.683793    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:36:23.683980    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:36:23.683987    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:36:23.684187    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:36:23.697074    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:23.724665    3744 start.go:296] duration metric: took 85.300709ms for postStartSetup
	I0831 15:36:23.724694    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.724869    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:36:23.724883    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.724980    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.725089    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.725189    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.725280    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.763464    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:36:23.763527    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:36:23.797396    3744 fix.go:56] duration metric: took 13.617113477s for fixHost
	I0831 15:36:23.797420    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.797554    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.797655    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.797749    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.797839    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.797970    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:23.798114    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:23.798122    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:36:23.858158    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143783.919023246
	
	I0831 15:36:23.858170    3744 fix.go:216] guest clock: 1725143783.919023246
	I0831 15:36:23.858175    3744 fix.go:229] Guest: 2024-08-31 15:36:23.919023246 -0700 PDT Remote: 2024-08-31 15:36:23.79741 -0700 PDT m=+14.070978631 (delta=121.613246ms)
	I0831 15:36:23.858196    3744 fix.go:200] guest clock delta is within tolerance: 121.613246ms
	I0831 15:36:23.858200    3744 start.go:83] releasing machines lock for "ha-949000", held for 13.677948956s
	I0831 15:36:23.858225    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858359    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:23.858452    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858730    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858831    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858919    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:36:23.858951    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.858972    3744 ssh_runner.go:195] Run: cat /version.json
	I0831 15:36:23.858983    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.859063    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.859085    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.859194    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.859214    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.859295    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.859309    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.859385    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.859397    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.890757    3744 ssh_runner.go:195] Run: systemctl --version
	I0831 15:36:23.938659    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:36:23.943864    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:36:23.943901    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:36:23.956026    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:36:23.956039    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:23.956147    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:23.971422    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:36:23.980435    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:36:23.989142    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:36:23.989181    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:36:23.997930    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:24.006635    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:36:24.015080    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:24.023671    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:36:24.032589    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:36:24.041364    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:36:24.050087    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:36:24.058866    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:36:24.066704    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:36:24.074622    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:24.168184    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
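The containerd reconfiguration above is a chain of in-place `sed` edits over /etc/containerd/config.toml followed by a daemon restart. A self-contained sketch of the cgroup-driver substitution from the log, applied to a scratch copy rather than the real config (file contents assumed for illustration; GNU sed as on the Linux guest):

```shell
# Throwaway stand-in for /etc/containerd/config.toml.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF

# The same substitution minikube runs to force the "cgroupfs" driver,
# preserving leading indentation via the captured group.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$tmp"

grep 'SystemdCgroup' "$tmp"
rm -f "$tmp"
```

The remaining edits in the log follow the same shape: anchor on a key, rewrite its value, keep indentation intact, then `systemctl daemon-reload && systemctl restart containerd` to pick the changes up.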
	I0831 15:36:24.187633    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:24.187713    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:36:24.206675    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:24.220212    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:36:24.240424    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:24.250685    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:24.261052    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:36:24.286854    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:24.297197    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:24.312454    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:36:24.315602    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:36:24.323102    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:36:24.337130    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:36:24.434813    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:36:24.537809    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:36:24.537887    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:36:24.552112    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:24.656146    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:36:26.992775    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.336585914s)
	I0831 15:36:26.992844    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:36:27.003992    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:36:27.018708    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:27.029918    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:36:27.137311    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:36:27.239047    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:27.342173    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:36:27.356192    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:27.367097    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:27.470187    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:36:27.536105    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:36:27.536192    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:36:27.540763    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:36:27.540810    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:36:27.544037    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:36:27.570291    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:36:27.570367    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:27.588378    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:27.648285    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:36:27.648336    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:27.648820    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:36:27.653344    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:36:27.662997    3744 kubeadm.go:883] updating cluster {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.
0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:36:27.663083    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:36:27.663134    3744 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:36:27.676654    3744 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:36:27.676670    3744 docker.go:615] Images already preloaded, skipping extraction
	I0831 15:36:27.676747    3744 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:36:27.690446    3744 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:36:27.690466    3744 cache_images.go:84] Images are preloaded, skipping loading
	I0831 15:36:27.690484    3744 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.31.0 docker true true} ...
	I0831 15:36:27.690565    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:36:27.690634    3744 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:36:27.729077    3744 cni.go:84] Creating CNI manager for ""
	I0831 15:36:27.729090    3744 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0831 15:36:27.729101    3744 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:36:27.729122    3744 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-949000 NodeName:ha-949000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:36:27.729202    3744 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-949000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 15:36:27.729215    3744 kube-vip.go:115] generating kube-vip config ...
	I0831 15:36:27.729267    3744 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:36:27.741901    3744 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:36:27.741972    3744 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:36:27.742025    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:36:27.751754    3744 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:36:27.751799    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0831 15:36:27.759784    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0831 15:36:27.773166    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:36:27.786640    3744 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0831 15:36:27.800639    3744 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:36:27.814083    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:36:27.817014    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
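The /etc/hosts update above is idempotent: any existing line for the name is stripped with `grep -v` before the fresh mapping is appended, so reruns never accumulate duplicates. A sketch on a scratch file, using the IP and hostname from the log (the stale 192.169.0.9 entry is an assumed example):

```shell
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.169.0.9\tcontrol-plane.minikube.internal\n' > "$hosts"

# Drop any prior mapping for the name, then append the current one.
{ grep -v $'\tcontrol-plane.minikube.internal$' "$hosts"; \
  printf '192.169.0.254\tcontrol-plane.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"

cat "$hosts"
rm -f "$hosts"
```

In the logged command the rebuilt file goes to /tmp/h.$$ and is copied back with `sudo cp`, since the SSH user cannot write /etc/hosts directly.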
	I0831 15:36:27.827332    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:27.924726    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:36:27.939552    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.5
	I0831 15:36:27.939571    3744 certs.go:194] generating shared ca certs ...
	I0831 15:36:27.939581    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:27.939767    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:36:27.939836    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:36:27.939848    3744 certs.go:256] generating profile certs ...
	I0831 15:36:27.939960    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:36:27.939980    3744 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7
	I0831 15:36:27.939996    3744 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0831 15:36:27.990143    3744 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7 ...
	I0831 15:36:27.990157    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7: {Name:mkcaa83b4b223ea37e242b23bc80c554e3269eac Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:27.990861    3744 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7 ...
	I0831 15:36:27.990872    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7: {Name:mk789cab6bc4fccb81a6d827e090943e3a032cb6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:27.991117    3744 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:36:27.991353    3744 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:36:27.991605    3744 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:36:27.991615    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:36:27.991642    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:36:27.991663    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:36:27.991688    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:36:27.991706    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:36:27.991724    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:36:27.991744    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:36:27.991761    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:36:27.991852    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:36:27.991900    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:36:27.991909    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:36:27.991937    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:36:27.991968    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:36:27.992001    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:36:27.992071    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:27.992107    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:36:27.992134    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:27.992153    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:36:27.992665    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:36:28.012619    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:36:28.037918    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:36:28.059676    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:36:28.085374    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0831 15:36:28.108665    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:36:28.134880    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:36:28.163351    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:36:28.189443    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:36:28.237208    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:36:28.275840    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:36:28.307738    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:36:28.327147    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:36:28.332485    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:36:28.341869    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:28.345319    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:28.345361    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:28.356453    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:36:28.366034    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:36:28.375170    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:36:28.378621    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:36:28.378656    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:36:28.382855    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:36:28.392032    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:36:28.401330    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:36:28.404932    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:36:28.404981    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:36:28.409135    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:36:28.418467    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:36:28.421857    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:36:28.426311    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:36:28.430575    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:36:28.435252    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:36:28.439597    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:36:28.443958    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
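The certificate checks above rely on `openssl x509 -checkend 86400`, which exits 0 only if the certificate is still valid 86400 seconds (24 hours) from now; a non-zero exit is what would trigger regeneration. A self-contained sketch with a throwaway self-signed certificate (subject name assumed):

```shell
dir=$(mktemp -d)

# Throwaway key + self-signed cert valid for 30 days.
openssl req -x509 -newkey rsa:2048 -nodes -subj '/CN=sketch' \
  -keyout "$dir/key.pem" -out "$dir/cert.pem" -days 30 2>/dev/null

# Exits 0 because the cert outlives the next 24 hours.
openssl x509 -noout -in "$dir/cert.pem" -checkend 86400

rm -rf "$dir"
```

Running the same check with `-checkend` larger than the cert's remaining lifetime (e.g. 30 days' worth of seconds plus one) flips the exit status, which is how near-expiry certs are caught a day ahead.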
	I0831 15:36:28.448329    3744 kubeadm.go:392] StartCluster: {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:36:28.448445    3744 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:36:28.461457    3744 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:36:28.469983    3744 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0831 15:36:28.469994    3744 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0831 15:36:28.470033    3744 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0831 15:36:28.478435    3744 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:36:28.478738    3744 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-949000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:28.478830    3744 kubeconfig.go:62] /Users/jenkins/minikube-integration/18943-957/kubeconfig needs updating (will repair): [kubeconfig missing "ha-949000" cluster setting kubeconfig missing "ha-949000" context setting]
	I0831 15:36:28.479071    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:28.479445    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:28.479626    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 15:36:28.479933    3744 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 15:36:28.480130    3744 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0831 15:36:28.488296    3744 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0831 15:36:28.488308    3744 kubeadm.go:597] duration metric: took 18.310201ms to restartPrimaryControlPlane
	I0831 15:36:28.488312    3744 kubeadm.go:394] duration metric: took 39.987749ms to StartCluster
	I0831 15:36:28.488320    3744 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:28.488392    3744 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:28.488767    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:28.488978    3744 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:36:28.488992    3744 start.go:241] waiting for startup goroutines ...
	I0831 15:36:28.489001    3744 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 15:36:28.489144    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:28.531040    3744 out.go:177] * Enabled addons: 
	I0831 15:36:28.551931    3744 addons.go:510] duration metric: took 62.927579ms for enable addons: enabled=[]
	I0831 15:36:28.552016    3744 start.go:246] waiting for cluster config update ...
	I0831 15:36:28.552028    3744 start.go:255] writing updated cluster config ...
	I0831 15:36:28.574130    3744 out.go:201] 
	I0831 15:36:28.595598    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:28.595734    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:28.618331    3744 out.go:177] * Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	I0831 15:36:28.659956    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:36:28.659989    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:36:28.660178    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:36:28.660194    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:36:28.660319    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:28.661341    3744 start.go:360] acquireMachinesLock for ha-949000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:36:28.661436    3744 start.go:364] duration metric: took 71.648µs to acquireMachinesLock for "ha-949000-m02"
	I0831 15:36:28.661461    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:36:28.661470    3744 fix.go:54] fixHost starting: m02
	I0831 15:36:28.661902    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:28.661926    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:28.670964    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51824
	I0831 15:36:28.671287    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:28.671608    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:28.671619    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:28.671857    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:28.671991    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:28.672109    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:36:28.672201    3744 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:28.672291    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:36:28.673213    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3528 missing from process table
	I0831 15:36:28.673240    3744 fix.go:112] recreateIfNeeded on ha-949000-m02: state=Stopped err=<nil>
	I0831 15:36:28.673248    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	W0831 15:36:28.673335    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:36:28.714811    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m02" ...
	I0831 15:36:28.736047    3744 main.go:141] libmachine: (ha-949000-m02) Calling .Start
	I0831 15:36:28.736403    3744 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:28.736434    3744 main.go:141] libmachine: (ha-949000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid
	I0831 15:36:28.738213    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3528 missing from process table
	I0831 15:36:28.738226    3744 main.go:141] libmachine: (ha-949000-m02) DBG | pid 3528 is in state "Stopped"
	I0831 15:36:28.738249    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid...
	I0831 15:36:28.738619    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Using UUID 23e5d675-5201-4f3d-86b7-b25c818528d1
	I0831 15:36:28.765315    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Generated MAC 92:7:3c:3f:ee:b7
	I0831 15:36:28.765332    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:36:28.765455    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c0a20)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:28.765495    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c0a20)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:28.765521    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "23e5d675-5201-4f3d-86b7-b25c818528d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:36:28.765553    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 23e5d675-5201-4f3d-86b7-b25c818528d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:36:28.765562    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:36:28.767165    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Pid is 3763
	I0831 15:36:28.767495    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 0
	I0831 15:36:28.767509    3744 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:28.767583    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3763
	I0831 15:36:28.769355    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:36:28.769415    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:36:28.769450    3744 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:36:28.769473    3744 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ebe4}
	I0831 15:36:28.769487    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Found match: 92:7:3c:3f:ee:b7
	I0831 15:36:28.769498    3744 main.go:141] libmachine: (ha-949000-m02) DBG | IP: 192.169.0.6
	I0831 15:36:28.769505    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:36:28.770167    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:28.770374    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:28.770722    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:36:28.770732    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:28.770845    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:28.770937    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:28.771045    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:28.771147    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:28.771273    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:28.771413    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:28.771572    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:28.771580    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:36:28.775197    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:36:28.783845    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:36:28.784655    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:28.784674    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:28.784685    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:28.784693    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:29.168717    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:36:29.168732    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:36:29.283641    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:29.283661    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:29.283712    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:29.283753    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:29.284560    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:36:29.284571    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:36:34.866750    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 15:36:34.866767    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 15:36:34.866778    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 15:36:34.891499    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 15:36:39.840129    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:36:39.840143    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:36:39.840307    3744 buildroot.go:166] provisioning hostname "ha-949000-m02"
	I0831 15:36:39.840319    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:36:39.840413    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:39.840489    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:39.840578    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.840665    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.840764    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:39.840907    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:39.841055    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:39.841064    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m02 && echo "ha-949000-m02" | sudo tee /etc/hostname
	I0831 15:36:39.913083    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m02
	
	I0831 15:36:39.913098    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:39.913252    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:39.913377    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.913471    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.913560    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:39.913685    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:39.913826    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:39.913837    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:36:39.987034    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:36:39.987048    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:36:39.987056    3744 buildroot.go:174] setting up certificates
	I0831 15:36:39.987062    3744 provision.go:84] configureAuth start
	I0831 15:36:39.987067    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:36:39.987204    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:39.987310    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:39.987418    3744 provision.go:143] copyHostCerts
	I0831 15:36:39.987447    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:39.987493    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:36:39.987499    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:39.988044    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:36:39.988241    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:39.988272    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:36:39.988277    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:39.988347    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:36:39.988492    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:39.988529    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:36:39.988533    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:39.988597    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:36:39.988746    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m02 san=[127.0.0.1 192.169.0.6 ha-949000-m02 localhost minikube]
	I0831 15:36:40.055665    3744 provision.go:177] copyRemoteCerts
	I0831 15:36:40.055717    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:36:40.055733    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.055998    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.056098    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.056185    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.056277    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:40.095370    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:36:40.095446    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:36:40.115272    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:36:40.115336    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:36:40.134845    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:36:40.134920    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:36:40.154450    3744 provision.go:87] duration metric: took 167.380587ms to configureAuth
	I0831 15:36:40.154464    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:36:40.154620    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:40.154633    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:40.154762    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.154852    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.154930    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.155003    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.155112    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.155216    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:40.155334    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:40.155341    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:36:40.220781    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:36:40.220794    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:36:40.220873    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:36:40.220884    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.221013    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.221103    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.221194    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.221272    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.221400    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:40.221546    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:40.221589    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:36:40.298646    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:36:40.298663    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.298789    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.298885    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.298979    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.299063    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.299201    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:40.299341    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:40.299353    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:36:41.956479    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:36:41.956495    3744 machine.go:96] duration metric: took 13.1856235s to provisionDockerMachine
	I0831 15:36:41.956502    3744 start.go:293] postStartSetup for "ha-949000-m02" (driver="hyperkit")
	I0831 15:36:41.956508    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:36:41.956522    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:41.956703    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:36:41.956716    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:41.956812    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:41.956896    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:41.956992    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:41.957077    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:42.000050    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:36:42.004306    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:36:42.004318    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:36:42.004439    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:36:42.004572    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:36:42.004578    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:36:42.004735    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:36:42.017617    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:42.041071    3744 start.go:296] duration metric: took 84.560659ms for postStartSetup
	I0831 15:36:42.041107    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.041300    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:36:42.041313    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:42.041398    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.041504    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.041609    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.041700    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:42.081048    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:36:42.081113    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:36:42.134445    3744 fix.go:56] duration metric: took 13.472828598s for fixHost
	I0831 15:36:42.134470    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:42.134618    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.134730    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.134822    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.134900    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.135030    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:42.135170    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:42.135178    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:36:42.199131    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143802.088359974
	
	I0831 15:36:42.199142    3744 fix.go:216] guest clock: 1725143802.088359974
	I0831 15:36:42.199147    3744 fix.go:229] Guest: 2024-08-31 15:36:42.088359974 -0700 PDT Remote: 2024-08-31 15:36:42.13446 -0700 PDT m=+32.407831620 (delta=-46.100026ms)
	I0831 15:36:42.199164    3744 fix.go:200] guest clock delta is within tolerance: -46.100026ms
	I0831 15:36:42.199169    3744 start.go:83] releasing machines lock for "ha-949000-m02", held for 13.537577271s
	I0831 15:36:42.199184    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.199330    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:42.220967    3744 out.go:177] * Found network options:
	I0831 15:36:42.242795    3744 out.go:177]   - NO_PROXY=192.169.0.5
	W0831 15:36:42.265056    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:36:42.265093    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.265983    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.266241    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.266370    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:36:42.266410    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	W0831 15:36:42.266454    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:36:42.266575    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:36:42.266625    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:42.266633    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.266836    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.266871    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.267025    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.267062    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.267162    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.267189    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:42.267302    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	W0831 15:36:42.303842    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:36:42.303902    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:36:42.349152    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:36:42.349174    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:42.349280    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:42.365129    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:36:42.373393    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:36:42.381789    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:36:42.381831    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:36:42.389963    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:42.398325    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:36:42.406574    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:42.414917    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:36:42.423513    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:36:42.431936    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:36:42.440352    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:36:42.449208    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:36:42.457008    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:36:42.464909    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:42.567905    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:36:42.588297    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:42.588366    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:36:42.602440    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:42.618217    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:36:42.633678    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:42.645147    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:42.656120    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:36:42.679235    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:42.690584    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:42.706263    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:36:42.709220    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:36:42.717254    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:36:42.730693    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:36:42.826051    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:36:42.930594    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:36:42.930623    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:36:42.944719    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:43.038034    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:36:45.352340    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.314261795s)
	I0831 15:36:45.352402    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:36:45.362569    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:36:45.374992    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:45.385146    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:36:45.481701    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:36:45.590417    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:45.703387    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:36:45.717135    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:45.728130    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:45.822749    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:36:45.893539    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:36:45.893614    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:36:45.898396    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:36:45.898450    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:36:45.901472    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:36:45.929873    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:36:45.929947    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:45.947410    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:45.987982    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:36:46.029879    3744 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:36:46.051790    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:46.052207    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:36:46.056767    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:36:46.066419    3744 mustload.go:65] Loading cluster: ha-949000
	I0831 15:36:46.066592    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:46.066799    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:46.066820    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:46.075457    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51846
	I0831 15:36:46.075806    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:46.076162    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:46.076180    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:46.076408    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:46.076531    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:36:46.076614    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:46.076682    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:36:46.077630    3744 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:36:46.077872    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:46.077895    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:46.086285    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51848
	I0831 15:36:46.086630    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:46.086945    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:46.086955    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:46.087205    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:46.087313    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:46.087418    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.6
	I0831 15:36:46.087426    3744 certs.go:194] generating shared ca certs ...
	I0831 15:36:46.087439    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:46.087575    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:36:46.087627    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:36:46.087636    3744 certs.go:256] generating profile certs ...
	I0831 15:36:46.087739    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:36:46.087826    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.e26aa346
	I0831 15:36:46.087882    3744 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:36:46.087890    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:36:46.087915    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:36:46.087944    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:36:46.087962    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:36:46.087979    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:36:46.087997    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:36:46.088015    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:36:46.088032    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:36:46.088113    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:36:46.088150    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:36:46.088158    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:36:46.088191    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:36:46.088226    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:36:46.088254    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:36:46.088317    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:46.088349    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.088368    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.088390    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.088420    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:46.088505    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:46.088596    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:46.088688    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:46.088763    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:46.117725    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:36:46.121346    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:36:46.129782    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:36:46.133012    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:36:46.141510    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:36:46.144605    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:36:46.152913    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:36:46.156010    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:36:46.165156    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:36:46.168250    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:36:46.176838    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:36:46.179929    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:36:46.189075    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:36:46.209492    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:36:46.229359    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:36:46.249285    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:36:46.268964    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0831 15:36:46.288566    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:36:46.308035    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:36:46.327968    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:36:46.347874    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:36:46.367538    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:36:46.387135    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:36:46.406841    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:36:46.420747    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:36:46.434267    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:36:46.447929    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:36:46.461487    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:36:46.475040    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:36:46.488728    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:36:46.502198    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:36:46.506532    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:36:46.514857    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.518202    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.518240    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.522435    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:36:46.530730    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:36:46.538900    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.542200    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.542233    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.546382    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:36:46.554646    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:36:46.562775    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.566092    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.566127    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.570335    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:36:46.578778    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:36:46.582068    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:36:46.586501    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:36:46.590751    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:36:46.594979    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:36:46.599120    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:36:46.603290    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 15:36:46.607503    3744 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.31.0 docker true true} ...
	I0831 15:36:46.607561    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:36:46.607581    3744 kube-vip.go:115] generating kube-vip config ...
	I0831 15:36:46.607619    3744 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:36:46.620005    3744 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:36:46.620042    3744 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
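The kube-vip static-pod manifest above is rendered by minikube from a config template (kube-vip.go:137) before being copied to /etc/kubernetes/manifests. A hedged sketch of that technique using Go's `text/template`; the template fragment and field names here are illustrative, not minikube's actual ones:

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// vipConfig holds the values substituted into the static-pod template.
type vipConfig struct {
	Address  string // the HA virtual IP, e.g. 192.169.0.254
	Port     int    // API server port fronted by the VIP
	LBEnable bool   // whether control-plane load-balancing is enabled
}

// A stripped-down fragment of the env block; the real manifest carries
// many more variables (vip_arp, cp_enable, leader-election settings, ...).
const vipTemplate = `    - name: address
      value: {{ .Address }}
    - name: port
      value: "{{ .Port }}"
    - name: lb_enable
      value: "{{ .LBEnable }}"
`

// renderVIP executes the template against cfg and returns the YAML fragment.
func renderVIP(cfg vipConfig) (string, error) {
	t, err := template.New("kube-vip").Parse(vipTemplate)
	if err != nil {
		return "", err
	}
	var buf bytes.Buffer
	if err := t.Execute(&buf, cfg); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	out, err := renderVIP(vipConfig{Address: "192.169.0.254", Port: 8443, LBEnable: true})
	if err != nil {
		panic(err)
	}
	fmt.Print(out)
}
```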
	I0831 15:36:46.620097    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:36:46.627507    3744 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:36:46.627555    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:36:46.634842    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:36:46.648529    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:36:46.661781    3744 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:36:46.675402    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:36:46.678250    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
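The /etc/hosts update above uses a filter-then-append idiom: strip any stale line for control-plane.minikube.internal, then write the current VIP entry via a temp file. The same transformation sketched in Go (minikube performs this through a remote shell, not in-process; `updateHosts` is a hypothetical helper):

```go
package main

import (
	"fmt"
	"strings"
)

// updateHosts drops any existing entry for host and appends a fresh
// "<ip>\t<host>" line, mirroring the bash one-liner
// { grep -v $'\t<host>$' /etc/hosts; echo "<ip>\t<host>"; } > /tmp/h.$$.
func updateHosts(hosts, ip, host string) string {
	var kept []string
	for _, line := range strings.Split(strings.TrimRight(hosts, "\n"), "\n") {
		if !strings.HasSuffix(line, "\t"+host) {
			kept = append(kept, line)
		}
	}
	kept = append(kept, ip+"\t"+host)
	return strings.Join(kept, "\n") + "\n"
}

func main() {
	in := "127.0.0.1\tlocalhost\n192.169.0.5\tcontrol-plane.minikube.internal\n"
	fmt.Print(updateHosts(in, "192.169.0.254", "control-plane.minikube.internal"))
}
```

Filtering on the full `\t<host>` suffix (rather than a substring match) is what keeps unrelated entries such as `localhost` untouched.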
	I0831 15:36:46.687467    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:46.779379    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:36:46.793112    3744 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:36:46.793294    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:46.814624    3744 out.go:177] * Verifying Kubernetes components...
	I0831 15:36:46.835323    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:46.948649    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:36:46.960452    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:46.960657    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:36:46.960690    3744 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:36:46.960842    3744 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:36:46.960927    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:46.960932    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:46.960940    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:46.960943    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.801259    3744 round_trippers.go:574] Response Status: 200 OK in 8840 milliseconds
	I0831 15:36:55.802034    3744 node_ready.go:49] node "ha-949000-m02" has status "Ready":"True"
	I0831 15:36:55.802046    3744 node_ready.go:38] duration metric: took 8.841092254s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:36:55.802051    3744 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:36:55.802085    3744 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 15:36:55.802094    3744 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 15:36:55.802131    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:36:55.802136    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.802142    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.802147    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.817181    3744 round_trippers.go:574] Response Status: 200 OK in 15 milliseconds
	I0831 15:36:55.823106    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.823166    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:36:55.823172    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.823178    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.823182    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.833336    3744 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0831 15:36:55.833806    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:55.833817    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.833824    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.833829    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.843262    3744 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0831 15:36:55.843562    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.843572    3744 pod_ready.go:82] duration metric: took 20.449445ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.843595    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.843648    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:36:55.843655    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.843662    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.843667    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.846571    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:55.846969    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:55.846976    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.846982    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.846985    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.848597    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.848912    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.848921    3744 pod_ready.go:82] duration metric: took 5.319208ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.848934    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.848970    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:36:55.848975    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.848981    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.848985    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.850738    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.851195    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:55.851203    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.851209    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.851212    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.852625    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.853038    3744 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.853047    3744 pod_ready.go:82] duration metric: took 4.107015ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.853053    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.853087    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:36:55.853092    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.853100    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.853104    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.854440    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.854845    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:55.854852    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.854858    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.854861    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.856182    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.856534    3744 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.856542    3744 pod_ready.go:82] duration metric: took 3.483952ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.856548    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.856578    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:36:55.856582    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.856588    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.856592    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.858303    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:56.003107    3744 request.go:632] Waited for 144.429757ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:56.003176    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:56.003183    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.003189    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.003193    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.004813    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:56.005140    3744 pod_ready.go:93] pod "etcd-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:56.005149    3744 pod_ready.go:82] duration metric: took 148.59533ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.005160    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.202344    3744 request.go:632] Waited for 197.12667ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:36:56.202386    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:36:56.202417    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.202425    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.202428    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.205950    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:56.403821    3744 request.go:632] Waited for 197.364477ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:56.403986    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:56.403997    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.404008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.404017    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.407269    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:56.407644    3744 pod_ready.go:98] node "ha-949000" hosting pod "kube-apiserver-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:56.407658    3744 pod_ready.go:82] duration metric: took 402.487822ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	E0831 15:36:56.407673    3744 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000" hosting pod "kube-apiserver-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:56.407681    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.602890    3744 request.go:632] Waited for 195.157951ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:36:56.602980    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:36:56.602991    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.603003    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.603010    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.606100    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:56.802222    3744 request.go:632] Waited for 195.71026ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:56.802289    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:56.802295    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.802301    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.802305    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.804612    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:56.804914    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:56.804923    3744 pod_ready.go:82] duration metric: took 397.232028ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.804930    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.003522    3744 request.go:632] Waited for 198.554376ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:36:57.003559    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:36:57.003600    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.003608    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.003618    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.005675    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.203456    3744 request.go:632] Waited for 197.402218ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:57.203520    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:57.203526    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.203532    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.203537    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.206124    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.206516    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:57.206526    3744 pod_ready.go:82] duration metric: took 401.586021ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.206534    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.402973    3744 request.go:632] Waited for 196.400032ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:36:57.403011    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:36:57.403017    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.403051    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.403056    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.405260    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.603636    3744 request.go:632] Waited for 197.987151ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:57.603708    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:57.603713    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.603719    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.603724    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.606022    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.606364    3744 pod_ready.go:98] node "ha-949000" hosting pod "kube-controller-manager-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:57.606376    3744 pod_ready.go:82] duration metric: took 399.83214ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	E0831 15:36:57.606383    3744 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000" hosting pod "kube-controller-manager-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:57.606388    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.802885    3744 request.go:632] Waited for 196.449707ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:57.803017    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:57.803028    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.803039    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.803046    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.806339    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.003449    3744 request.go:632] Waited for 196.421818ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.003513    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.003518    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.003524    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.003527    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.005621    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:58.203691    3744 request.go:632] Waited for 95.498322ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:58.203749    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:58.203758    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.203763    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.203766    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.207046    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.403784    3744 request.go:632] Waited for 196.241368ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.403948    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.403963    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.403974    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.404010    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.407767    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.608224    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:58.608245    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.608257    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.608265    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.611367    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.802284    3744 request.go:632] Waited for 190.220665ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.802382    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.802393    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.802407    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.802421    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.806173    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.108214    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:59.108238    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.108248    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.108332    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.111913    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.202533    3744 request.go:632] Waited for 89.639104ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:59.202672    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:59.202684    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.202693    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.202700    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.205790    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.608244    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:59.608308    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.608333    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.608346    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.611536    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.612038    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:59.612050    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.612056    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.612059    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.613486    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:59.613797    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:00.108234    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:00.108258    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.108269    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.108276    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.112243    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:00.112803    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:00.112811    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.112816    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.112819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.114922    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:00.608266    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:00.608291    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.608340    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.608348    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.611571    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:00.612033    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:00.612041    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.612047    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.612051    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.614268    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:01.108244    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:01.108270    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.108282    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.108287    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.112176    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:01.112688    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:01.112697    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.112703    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.112706    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.114756    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:01.608252    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:01.608269    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.608303    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.608308    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.610548    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:01.610932    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:01.610940    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.610946    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.610951    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.612574    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:02.108349    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:02.108375    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.108386    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.108392    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.111907    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:02.112645    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:02.112653    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.112658    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.112662    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.114143    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:02.114439    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:02.608228    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:02.608245    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.608252    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.608256    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.610772    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:02.611191    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:02.611199    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.611206    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.611210    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.613037    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:03.108219    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:03.108235    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.108241    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.108250    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.111668    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:03.112196    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:03.112204    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.112211    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.112214    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.114279    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:03.608402    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:03.608463    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.608509    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.608524    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.611720    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:03.612413    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:03.612424    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.612432    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.612436    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.615410    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:04.108309    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:04.108328    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.108337    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.108341    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.115334    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:04.115796    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:04.115804    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.115815    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.115818    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.122611    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:04.122876    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:04.608750    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:04.608825    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.608840    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.608846    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.612925    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:04.613492    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:04.613499    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.613505    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.613509    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.614977    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:05.106817    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:05.106842    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.106852    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.106859    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.110466    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:05.111095    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:05.111106    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.111113    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.111117    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.112615    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:05.608187    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:05.608211    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.608224    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.608248    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.611732    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:05.612260    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:05.612270    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.612278    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.612284    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.614120    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:06.107506    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:06.107527    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.107540    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.107545    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.110547    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:06.111218    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:06.111229    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.111237    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.111242    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.112971    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:06.607368    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:06.607380    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.607386    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.607391    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.609787    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:06.610207    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:06.610215    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.610221    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.610224    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.611989    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:06.612289    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:07.107726    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:07.107744    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.107773    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.107777    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.109482    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:07.109930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:07.109937    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.109943    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.109947    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.111448    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:07.607689    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:07.607742    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.607753    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.607759    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.610882    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:07.611345    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:07.611353    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.611359    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.611369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.613392    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:08.107409    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:08.107435    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.107446    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.107451    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.111199    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:08.111808    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:08.111815    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.111820    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.111825    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.113569    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:08.607450    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:08.607477    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.607489    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.607494    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.611034    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:08.611547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:08.611557    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.611563    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.611568    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.613347    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:08.613756    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:09.108698    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:09.108730    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.108778    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.108791    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.112115    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:09.112783    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:09.112791    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.112796    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.112803    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.114417    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:09.606780    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:09.606804    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.606816    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.606824    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.609915    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:09.610481    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:09.610488    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.610494    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.610497    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.612172    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:10.106727    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:10.106745    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.106779    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.106786    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.109423    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:10.109937    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:10.109944    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.109950    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.109953    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.111717    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:10.607619    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:10.607642    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.607653    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.607658    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.610928    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:10.611460    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:10.611467    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.611472    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.611475    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.613024    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:11.108825    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:11.108848    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.108859    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.108865    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.112708    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:11.113184    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:11.113195    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.113202    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.113207    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.115187    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:11.116261    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:11.607215    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:11.607243    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.607254    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.607261    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.611037    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:11.611547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:11.611557    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.611565    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.611569    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.613373    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.108739    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:12.108764    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.108774    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.108779    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.112484    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:12.113117    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:12.113125    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.113131    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.113135    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.114878    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.608099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:12.608124    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.608133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.608140    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.611866    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:12.612563    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:12.612571    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.612577    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.612581    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.614297    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.614794    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.614803    3744 pod_ready.go:82] duration metric: took 15.008248116s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.614810    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.614849    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:37:12.614854    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.614860    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.614864    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.617726    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:12.618084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:12.618092    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.618097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.618100    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.619622    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.620160    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.620169    3744 pod_ready.go:82] duration metric: took 5.352553ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.620175    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.620212    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:37:12.620217    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.620222    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.620225    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.624634    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:12.625059    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:12.625066    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.625071    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.625074    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.626559    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.626901    3744 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.626910    3744 pod_ready.go:82] duration metric: took 6.729281ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.626916    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.626951    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:37:12.626956    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.626961    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.626964    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.628480    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.628945    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:12.628956    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.628961    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.628965    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.630425    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.630760    3744 pod_ready.go:93] pod "kube-proxy-d45q5" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.630769    3744 pod_ready.go:82] duration metric: took 3.847336ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.630775    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.630807    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:12.630812    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.630817    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.630821    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.632536    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.633060    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:12.633067    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.633072    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.633077    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.634424    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:13.132549    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:13.132573    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.132585    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.132591    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.135680    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:13.136120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:13.136128    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.136133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.136137    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.137931    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:13.632454    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:13.632468    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.632474    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.632477    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.634478    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:13.634979    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:13.634987    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.634992    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.634997    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.636493    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:14.132750    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:14.132776    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.132788    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.132794    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.136342    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:14.136985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:14.136993    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.136999    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.137002    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.139021    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:14.630998    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:14.631010    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.631017    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.631019    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.637296    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:14.637754    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:14.637761    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.637767    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.637770    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.645976    3744 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0831 15:37:14.646303    3744 pod_ready.go:103] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:15.131375    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:15.131389    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.131395    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.131398    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.136989    3744 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:37:15.137543    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:15.137552    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.137557    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.137561    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.145480    3744 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:37:15.631037    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:15.631049    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.631056    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.631060    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.650939    3744 round_trippers.go:574] Response Status: 200 OK in 19 milliseconds
	I0831 15:37:15.657344    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:15.657354    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.657360    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.657363    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.664319    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:16.131044    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:16.131056    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.131062    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.131065    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.133359    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:16.133806    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:16.133815    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.133821    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.133835    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.135405    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:16.631836    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:16.631848    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.631854    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.631858    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.633942    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:16.634428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:16.634436    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.634442    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.634449    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.636230    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.131746    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:17.131800    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.131814    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.131820    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.135452    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:17.136132    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:17.136139    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.136145    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.136148    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.137779    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.138135    3744 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.138143    3744 pod_ready.go:82] duration metric: took 4.507315671s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.138150    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.138183    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:37:17.138187    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.138193    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.138198    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.140005    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.140372    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:17.140380    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.140385    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.140388    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.142052    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.142371    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.142380    3744 pod_ready.go:82] duration metric: took 4.22523ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.142387    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.142420    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:37:17.142425    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.142430    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.142433    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.144162    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.144573    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:17.144580    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.144585    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.144591    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.146052    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.146407    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.146415    3744 pod_ready.go:82] duration metric: took 4.022752ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.146422    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.208351    3744 request.go:632] Waited for 61.893937ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:17.208418    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:17.208435    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.208444    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.208449    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.211070    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:17.408566    3744 request.go:632] Waited for 197.051034ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:17.408606    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:17.408614    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.408622    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.408627    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.410767    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:17.411178    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.411187    3744 pod_ready.go:82] duration metric: took 264.75731ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.411194    3744 pod_ready.go:39] duration metric: took 21.608904421s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:37:17.411208    3744 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:37:17.411260    3744 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:37:17.423683    3744 api_server.go:72] duration metric: took 30.630215512s to wait for apiserver process to appear ...
	I0831 15:37:17.423694    3744 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:37:17.423707    3744 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:37:17.427947    3744 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:37:17.427987    3744 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:37:17.427992    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.427998    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.428008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.428562    3744 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:37:17.428682    3744 api_server.go:141] control plane version: v1.31.0
	I0831 15:37:17.428691    3744 api_server.go:131] duration metric: took 4.99355ms to wait for apiserver health ...
	I0831 15:37:17.428699    3744 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:37:17.609319    3744 request.go:632] Waited for 180.546017ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:17.609356    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:17.609364    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.609372    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.609378    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.615729    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:17.620529    3744 system_pods.go:59] 24 kube-system pods found
	I0831 15:37:17.620549    3744 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:17.620557    3744 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:17.620562    3744 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:37:17.620566    3744 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:37:17.620569    3744 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:37:17.620572    3744 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:37:17.620577    3744 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:37:17.620581    3744 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:37:17.620583    3744 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:37:17.620586    3744 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:37:17.620588    3744 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:37:17.620593    3744 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0831 15:37:17.620596    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:37:17.620599    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:37:17.620602    3744 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:37:17.620605    3744 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:37:17.620607    3744 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:37:17.620610    3744 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:37:17.620612    3744 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:37:17.620615    3744 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:37:17.620617    3744 system_pods.go:61] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:37:17.620620    3744 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:37:17.620622    3744 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:37:17.620625    3744 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:37:17.620628    3744 system_pods.go:74] duration metric: took 191.923916ms to wait for pod list to return data ...
	I0831 15:37:17.620634    3744 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:37:17.808285    3744 request.go:632] Waited for 187.597884ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:17.808399    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:17.808411    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.808422    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.808429    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.812254    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:17.812385    3744 default_sa.go:45] found service account: "default"
	I0831 15:37:17.812394    3744 default_sa.go:55] duration metric: took 191.75371ms for default service account to be created ...
	I0831 15:37:17.812410    3744 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:37:18.009398    3744 request.go:632] Waited for 196.900555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:18.009462    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:18.009503    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:18.009518    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:18.009526    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:18.017075    3744 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:37:18.022069    3744 system_pods.go:86] 24 kube-system pods found
	I0831 15:37:18.022087    3744 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:18.022093    3744 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:18.022097    3744 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:37:18.022101    3744 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:37:18.022105    3744 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:37:18.022108    3744 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:37:18.022111    3744 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:37:18.022114    3744 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:37:18.022117    3744 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:37:18.022120    3744 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:37:18.022123    3744 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:37:18.022127    3744 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0831 15:37:18.022131    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:37:18.022134    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:37:18.022138    3744 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:37:18.022140    3744 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:37:18.022143    3744 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:37:18.022146    3744 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:37:18.022148    3744 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:37:18.022152    3744 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:37:18.022155    3744 system_pods.go:89] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:37:18.022157    3744 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:37:18.022160    3744 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:37:18.022162    3744 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:37:18.022168    3744 system_pods.go:126] duration metric: took 209.74863ms to wait for k8s-apps to be running ...
	I0831 15:37:18.022173    3744 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:37:18.022230    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:37:18.033610    3744 system_svc.go:56] duration metric: took 11.428501ms WaitForService to wait for kubelet
	I0831 15:37:18.033632    3744 kubeadm.go:582] duration metric: took 31.24015665s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:37:18.033647    3744 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:37:18.208845    3744 request.go:632] Waited for 175.149396ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:37:18.208908    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:37:18.208914    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:18.208921    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:18.208926    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:18.213884    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:18.214480    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:37:18.214495    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:37:18.214504    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:37:18.214507    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:37:18.214510    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:37:18.214513    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:37:18.214516    3744 node_conditions.go:105] duration metric: took 180.864612ms to run NodePressure ...
	I0831 15:37:18.214525    3744 start.go:241] waiting for startup goroutines ...
	I0831 15:37:18.214542    3744 start.go:255] writing updated cluster config ...
	I0831 15:37:18.235038    3744 out.go:201] 
	I0831 15:37:18.272074    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:18.272141    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:37:18.293920    3744 out.go:177] * Starting "ha-949000-m03" control-plane node in "ha-949000" cluster
	I0831 15:37:18.336055    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:37:18.336091    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:37:18.336291    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:37:18.336317    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:37:18.336472    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:37:18.337744    3744 start.go:360] acquireMachinesLock for ha-949000-m03: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:37:18.337863    3744 start.go:364] duration metric: took 91.481µs to acquireMachinesLock for "ha-949000-m03"
	I0831 15:37:18.337896    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:37:18.337907    3744 fix.go:54] fixHost starting: m03
	I0831 15:37:18.338304    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:37:18.338331    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:37:18.347585    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51853
	I0831 15:37:18.347933    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:37:18.348309    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:37:18.348325    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:37:18.348554    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:37:18.348680    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:18.348764    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:37:18.348835    3744 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:18.348927    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:37:18.349821    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid 3227 missing from process table
	I0831 15:37:18.349851    3744 fix.go:112] recreateIfNeeded on ha-949000-m03: state=Stopped err=<nil>
	I0831 15:37:18.349859    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	W0831 15:37:18.349928    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:37:18.371074    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m03" ...
	I0831 15:37:18.413086    3744 main.go:141] libmachine: (ha-949000-m03) Calling .Start
	I0831 15:37:18.413447    3744 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:18.413507    3744 main.go:141] libmachine: (ha-949000-m03) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid
	I0831 15:37:18.415280    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid 3227 missing from process table
	I0831 15:37:18.415294    3744 main.go:141] libmachine: (ha-949000-m03) DBG | pid 3227 is in state "Stopped"
	I0831 15:37:18.415313    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid...
	I0831 15:37:18.415660    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Using UUID 3fdefe95-7552-4d5b-8412-6ae6e5c787bb
	I0831 15:37:18.441752    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Generated MAC fa:59:9e:3b:35:6d
	I0831 15:37:18.441781    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:37:18.441964    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00037b4a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:37:18.442001    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00037b4a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:37:18.442067    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "3fdefe95-7552-4d5b-8412-6ae6e5c787bb", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:37:18.442136    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 3fdefe95-7552-4d5b-8412-6ae6e5c787bb -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:37:18.442155    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:37:18.443921    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Pid is 3783
	I0831 15:37:18.444292    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 0
	I0831 15:37:18.444304    3744 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:18.444362    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3783
	I0831 15:37:18.446124    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:37:18.446228    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:37:18.446248    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:37:18.446260    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:37:18.446272    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d4eb85}
	I0831 15:37:18.446306    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:37:18.446321    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Found match: fa:59:9e:3b:35:6d
	I0831 15:37:18.446335    3744 main.go:141] libmachine: (ha-949000-m03) DBG | IP: 192.169.0.7
	I0831 15:37:18.446363    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:37:18.447082    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:18.447293    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:37:18.447693    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:37:18.447703    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:18.447827    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:18.447958    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:18.448072    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:18.448161    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:18.448250    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:18.448355    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:18.448517    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:18.448526    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:37:18.451810    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:37:18.461189    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:37:18.462060    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:37:18.462072    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:37:18.462081    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:37:18.462086    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:37:18.852728    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:37:18.852743    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:37:18.968113    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:37:18.968132    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:37:18.968140    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:37:18.968171    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:37:18.968968    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:37:18.968978    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:37:24.540624    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 15:37:24.540682    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 15:37:24.540695    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 15:37:24.565460    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 15:37:29.520863    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:37:29.520878    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:37:29.521004    3744 buildroot.go:166] provisioning hostname "ha-949000-m03"
	I0831 15:37:29.521015    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:37:29.521111    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.521203    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.521290    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.521386    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.521482    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.521612    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.521765    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.521776    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m03 && echo "ha-949000-m03" | sudo tee /etc/hostname
	I0831 15:37:29.591531    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m03
	
	I0831 15:37:29.591551    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.591708    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.591803    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.591884    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.591995    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.592173    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.592330    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.592341    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:37:29.658685    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:37:29.658701    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:37:29.658714    3744 buildroot.go:174] setting up certificates
	I0831 15:37:29.658720    3744 provision.go:84] configureAuth start
	I0831 15:37:29.658727    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:37:29.658867    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:29.658966    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.659054    3744 provision.go:143] copyHostCerts
	I0831 15:37:29.659089    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:37:29.659140    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:37:29.659146    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:37:29.659263    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:37:29.659455    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:37:29.659484    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:37:29.659488    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:37:29.659564    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:37:29.659714    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:37:29.659747    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:37:29.659753    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:37:29.659818    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:37:29.659964    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m03 san=[127.0.0.1 192.169.0.7 ha-949000-m03 localhost minikube]
	I0831 15:37:29.736089    3744 provision.go:177] copyRemoteCerts
	I0831 15:37:29.736163    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:37:29.736179    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.736322    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.736416    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.736504    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.736597    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:29.771590    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:37:29.771658    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:37:29.791254    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:37:29.791326    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:37:29.810923    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:37:29.810991    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:37:29.830631    3744 provision.go:87] duration metric: took 171.900577ms to configureAuth
	I0831 15:37:29.830645    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:37:29.830811    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:29.830824    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:29.830954    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.831042    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.831126    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.831207    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.831289    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.831399    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.831522    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.831530    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:37:29.892205    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:37:29.892217    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:37:29.892291    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:37:29.892302    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.892426    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.892516    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.892609    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.892714    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.892838    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.892976    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.893022    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:37:29.961258    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:37:29.961276    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.961414    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.961511    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.961619    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.961703    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.961817    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.961955    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.961967    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:37:31.615783    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:37:31.615799    3744 machine.go:96] duration metric: took 13.167957184s to provisionDockerMachine
	I0831 15:37:31.615806    3744 start.go:293] postStartSetup for "ha-949000-m03" (driver="hyperkit")
	I0831 15:37:31.615814    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:37:31.615823    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.616028    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:37:31.616046    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.616158    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.616258    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.616349    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.616481    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:31.654537    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:37:31.657860    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:37:31.657873    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:37:31.657960    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:37:31.658093    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:37:31.658099    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:37:31.658258    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:37:31.672215    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:37:31.694606    3744 start.go:296] duration metric: took 78.79067ms for postStartSetup
	I0831 15:37:31.694628    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.694811    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:37:31.694825    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.694916    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.695011    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.695099    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.695179    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:31.731833    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:37:31.731896    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:37:31.763292    3744 fix.go:56] duration metric: took 13.425238964s for fixHost
	I0831 15:37:31.763317    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.763450    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.763540    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.763638    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.763730    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.763846    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:31.764005    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:31.764012    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:37:31.823707    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143851.888011101
	
	I0831 15:37:31.823721    3744 fix.go:216] guest clock: 1725143851.888011101
	I0831 15:37:31.823727    3744 fix.go:229] Guest: 2024-08-31 15:37:31.888011101 -0700 PDT Remote: 2024-08-31 15:37:31.763307 -0700 PDT m=+82.036146513 (delta=124.704101ms)
	I0831 15:37:31.823737    3744 fix.go:200] guest clock delta is within tolerance: 124.704101ms
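	The guest-clock check above subtracts the host wall clock from the guest's `date +%s.%N` reading and compares the delta against a tolerance. A minimal sketch of that arithmetic, using the two timestamps from this log:

	```shell
	# Timestamps from the log: guest `date +%s.%N` vs the host wall clock.
	guest=1725143851.888011101
	remote=1725143851.763307
	# awk does the float subtraction; the log reports the same ~124.704ms delta.
	awk -v g="$guest" -v r="$remote" 'BEGIN { printf "%.3fms\n", (g - r) * 1000 }'
	```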
	I0831 15:37:31.823741    3744 start.go:83] releasing machines lock for "ha-949000-m03", held for 13.485720355s
	I0831 15:37:31.823765    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.823906    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:31.845130    3744 out.go:177] * Found network options:
	I0831 15:37:31.865299    3744 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0831 15:37:31.886126    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:37:31.886160    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:37:31.886178    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.886943    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.887142    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.887254    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:37:31.887286    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	W0831 15:37:31.887368    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:37:31.887394    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:37:31.887504    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:37:31.887511    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.887521    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.887696    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.887743    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.887910    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.887987    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.888104    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.888248    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:31.888351    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	W0831 15:37:31.921752    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:37:31.921817    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:37:31.966799    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:37:31.966823    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:37:31.966938    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:37:31.983482    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:37:31.992712    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:37:32.002010    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:37:32.002056    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:37:32.011011    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:37:32.020061    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:37:32.028982    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:37:32.038569    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:37:32.048027    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:37:32.057745    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:37:32.066832    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:37:32.075930    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:37:32.084234    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:37:32.092513    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:32.200002    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
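	The run of `sed` edits above rewrites containerd's config in place before the daemon restart. A sketch of the key `SystemdCgroup` toggle, run against a throwaway copy rather than the real `/etc/containerd/config.toml`:

	```shell
	# Minimal stand-in for config.toml with the runc options table (assumed layout).
	cat > /tmp/config.toml <<'EOF'
	[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
	  SystemdCgroup = true
	EOF
	# Same substitution the log runs: flip SystemdCgroup while keeping indentation.
	sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /tmp/config.toml
	grep 'SystemdCgroup' /tmp/config.toml
	```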
	I0831 15:37:32.218717    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:37:32.218782    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:37:32.234470    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:37:32.246859    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:37:32.268072    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:37:32.279723    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:37:32.291270    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:37:32.313992    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:37:32.325465    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:37:32.340891    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:37:32.343755    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:37:32.351807    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:37:32.365348    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:37:32.460495    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:37:32.562594    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:37:32.562619    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:37:32.576763    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:32.677110    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:37:34.994745    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.317591857s)
	I0831 15:37:34.994823    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:37:35.005392    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:37:35.018138    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:37:35.028648    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:37:35.124983    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:37:35.235732    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:35.346302    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:37:35.360082    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:37:35.370959    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:35.477096    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:37:35.544102    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:37:35.544184    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:37:35.548776    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:37:35.548834    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:37:35.551795    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:37:35.578659    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:37:35.578734    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:37:35.596206    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:37:35.640045    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:37:35.682013    3744 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:37:35.703018    3744 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:37:35.723860    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:35.724174    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:37:35.728476    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
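	The `/etc/hosts` update above uses a grep-and-rewrite idiom so repeated runs leave exactly one `host.minikube.internal` entry. The same idiom against a temp copy (paths and the stale `10.0.0.9` entry are illustrative):

	```shell
	HOSTS=/tmp/hosts.demo
	# Start from a copy that already has a stale host.minikube.internal entry.
	printf '127.0.0.1\tlocalhost\n10.0.0.9\thost.minikube.internal\n' > "$HOSTS"
	# Drop any existing entry, then append the current one (mirrors the log's command).
	{ grep -v $'\thost.minikube.internal$' "$HOSTS"; \
	  printf '192.169.0.1\thost.minikube.internal\n'; } > "$HOSTS.new"
	mv "$HOSTS.new" "$HOSTS"
	cat "$HOSTS"
	```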
	I0831 15:37:35.738147    3744 mustload.go:65] Loading cluster: ha-949000
	I0831 15:37:35.738335    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:35.738551    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:37:35.738572    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:37:35.747642    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51875
	I0831 15:37:35.747990    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:37:35.748302    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:37:35.748315    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:37:35.748544    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:37:35.748655    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:37:35.748733    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:35.748808    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:37:35.749749    3744 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:37:35.749998    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:37:35.750023    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:37:35.758673    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51877
	I0831 15:37:35.758994    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:37:35.759349    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:37:35.759365    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:37:35.759557    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:37:35.759653    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:37:35.759755    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.7
	I0831 15:37:35.759761    3744 certs.go:194] generating shared ca certs ...
	I0831 15:37:35.759770    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:37:35.759913    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:37:35.759965    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:37:35.759974    3744 certs.go:256] generating profile certs ...
	I0831 15:37:35.760073    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:37:35.760161    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3
	I0831 15:37:35.760221    3744 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:37:35.760228    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:37:35.760249    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:37:35.760273    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:37:35.760292    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:37:35.760308    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:37:35.760333    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:37:35.760352    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:37:35.760368    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:37:35.760450    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:37:35.760489    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:37:35.760497    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:37:35.760534    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:37:35.760565    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:37:35.760594    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:37:35.760658    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:37:35.760694    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:35.760715    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:37:35.760733    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:37:35.760757    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:37:35.760839    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:37:35.760910    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:37:35.761012    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:37:35.761091    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:37:35.789354    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:37:35.793275    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:37:35.801794    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:37:35.805175    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:37:35.813194    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:37:35.816357    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:37:35.824019    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:37:35.827176    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:37:35.835398    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:37:35.838546    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:37:35.847890    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:37:35.851045    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:37:35.858866    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:37:35.879287    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:37:35.899441    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:37:35.919810    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:37:35.940109    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0831 15:37:35.960051    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:37:35.979638    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:37:35.999504    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:37:36.019089    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:37:36.039173    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:37:36.058828    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:37:36.078456    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:37:36.092789    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:37:36.106379    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:37:36.119946    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:37:36.133839    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:37:36.148101    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:37:36.161739    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:37:36.175159    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:37:36.179390    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:37:36.187703    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:37:36.191071    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:37:36.191114    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:37:36.195292    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:37:36.203552    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:37:36.212239    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:36.215746    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:36.215790    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:36.219988    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:37:36.228608    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:37:36.237421    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:37:36.240805    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:37:36.240843    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:37:36.245119    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
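	The `ln -fs .../<hash>.0` steps above exist because OpenSSL resolves CA certificates by subject-hash filename. A sketch of the same idiom with a freshly generated throwaway cert (paths are illustrative):

	```shell
	mkdir -p /tmp/certs.demo
	# Throwaway self-signed cert standing in for one of the .pem files above.
	openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/certs.demo/key.pem \
	  -out /tmp/certs.demo/ca.pem -days 1 -subj "/CN=demoCA" 2>/dev/null
	# Same idiom as the log: link the cert under its subject-hash name.
	h=$(openssl x509 -hash -noout -in /tmp/certs.demo/ca.pem)
	ln -fs /tmp/certs.demo/ca.pem "/tmp/certs.demo/$h.0"
	ls -l "/tmp/certs.demo/$h.0"
	```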
	I0831 15:37:36.253604    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:37:36.256982    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:37:36.261329    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:37:36.265579    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:37:36.269756    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:37:36.273922    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:37:36.278236    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
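	Each `openssl x509 ... -checkend 86400` call above exits nonzero if the certificate expires within the next 24 hours, which is what would trigger regeneration. The same check against a throwaway cert (filenames are illustrative):

	```shell
	# Generate a cert valid for 2 days, then ask whether it survives the next 86400s.
	openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/ck-key.pem \
	  -out /tmp/ck-cert.pem -days 2 -subj "/CN=checkend-demo" 2>/dev/null
	# Exit status 0 here is what lets the log proceed without regenerating certs.
	openssl x509 -noout -in /tmp/ck-cert.pem -checkend 86400
	```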
	I0831 15:37:36.282870    3744 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.31.0 docker true true} ...
	I0831 15:37:36.282943    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:37:36.282961    3744 kube-vip.go:115] generating kube-vip config ...
	I0831 15:37:36.283008    3744 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:37:36.296221    3744 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:37:36.296272    3744 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:37:36.296330    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:37:36.304482    3744 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:37:36.304539    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:37:36.311975    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:37:36.325288    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:37:36.338951    3744 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:37:36.352501    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:37:36.355411    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:37:36.364926    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:36.456418    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:37:36.471558    3744 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:37:36.471752    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:36.529525    3744 out.go:177] * Verifying Kubernetes components...
	I0831 15:37:36.550389    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:36.691381    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:37:36.709538    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:37:36.709731    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:37:36.709775    3744 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:37:36.709942    3744 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:37:36.709989    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:36.709994    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.710000    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.710003    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.712128    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:36.712576    3744 node_ready.go:49] node "ha-949000-m03" has status "Ready":"True"
	I0831 15:37:36.712585    3744 node_ready.go:38] duration metric: took 2.63459ms for node "ha-949000-m03" to be "Ready" ...
	I0831 15:37:36.712591    3744 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:37:36.712631    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:36.712638    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.712643    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.712650    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.716253    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:36.722917    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:36.722974    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:36.722980    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.722986    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.722989    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.725559    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:36.726201    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:36.726209    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.726215    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.726231    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.728257    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:37.223697    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:37.223717    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.223728    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.223737    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.235316    3744 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0831 15:37:37.236200    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:37.236213    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.236221    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.236224    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.238445    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:37.723177    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:37.723191    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.723198    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.723201    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.730411    3744 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:37:37.731034    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:37.731043    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.731048    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.731053    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.733549    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.223151    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:38.223168    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.223174    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.223177    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.225984    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.226378    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:38.226386    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.226391    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.226394    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.229300    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.724309    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:38.724325    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.724334    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.724337    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.726908    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.727435    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:38.727443    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.727449    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.727454    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.729651    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.730063    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:39.223582    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:39.223601    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.223608    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.223627    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.225990    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:39.226495    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:39.226503    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.226509    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.226514    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.228583    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:39.724043    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:39.724057    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.724068    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.724079    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.726325    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:39.726730    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:39.726738    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.726744    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.726748    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.728764    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:40.223977    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:40.223993    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.224000    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.224004    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.226279    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:40.226700    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:40.226708    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.226714    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.226718    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.228516    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:40.724602    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:40.724619    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.724628    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.724634    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.727418    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:40.727959    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:40.727966    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.727972    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.727983    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.729907    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:40.730276    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:41.223101    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:41.223117    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.223124    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.223128    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.225118    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:41.225750    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:41.225761    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.225768    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.225772    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.227757    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:41.724913    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:41.724940    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.724951    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.725035    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.728761    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:41.729240    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:41.729247    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.729252    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.729255    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.730912    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:42.224964    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:42.224989    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.225001    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.225006    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.228620    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:42.229196    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:42.229204    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.229210    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.229214    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.232307    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:42.725079    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:42.725106    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.725118    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.725128    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.728799    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:42.729409    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:42.729420    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.729429    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.729435    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.731172    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:42.731531    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:43.225019    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:43.225047    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.225060    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.225067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.228808    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:43.229389    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:43.229399    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.229405    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.229409    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.231056    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:43.724985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:43.725000    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.725006    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.725010    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.727056    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:43.727478    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:43.727485    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.727491    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.727494    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.729068    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:44.224095    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:44.224121    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.224133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.224181    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.227349    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:44.228120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:44.228128    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.228134    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.228138    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.229966    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:44.725021    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:44.725045    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.725058    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.725062    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.729238    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:44.729727    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:44.729735    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.729741    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.729745    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.731433    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:44.731726    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:45.225302    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:45.225330    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.225341    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.225347    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.228863    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:45.229379    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:45.229389    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.229397    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.229401    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.231429    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:45.724243    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:45.724324    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.724337    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.724344    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.727683    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:45.728405    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:45.728413    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.728419    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.728422    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.730098    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:46.223716    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:46.223773    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.223788    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.223796    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.227605    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:46.228067    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:46.228076    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.228082    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.228085    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.229768    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:46.724565    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:46.724619    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.724633    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.724641    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.728150    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:46.728985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:46.728992    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.728998    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.729001    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.730855    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:47.224578    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:47.224599    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.224612    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.224618    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.227578    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:47.228002    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:47.228009    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.228015    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.228018    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.229721    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:47.230041    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:47.724560    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:47.724585    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.724594    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.724599    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.728122    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:47.728734    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:47.728742    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.728748    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.728751    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.730435    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:48.223615    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:48.223629    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.223636    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.223640    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.226095    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:48.226577    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:48.226586    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.226591    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.226598    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.228415    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:48.724122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:48.724142    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.724153    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.724160    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.727651    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:48.728172    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:48.728183    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.728191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.728195    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.729902    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:49.223260    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:49.223281    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.223292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.223298    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.226301    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:49.226932    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:49.226940    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.226945    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.226947    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.228480    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:49.724076    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:49.724109    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.724120    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.724127    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.727544    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:49.728275    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:49.728284    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.728290    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.728293    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.729994    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:49.730332    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:50.223448    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:50.223462    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.223471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.223475    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.225685    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:50.226217    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:50.226225    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.226231    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.226242    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.228286    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:50.723871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:50.723896    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.723910    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.723918    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.727053    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:50.728013    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:50.728021    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.728027    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.728033    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.729924    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:51.223394    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:51.223411    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.223419    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.223424    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.226019    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:51.226638    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:51.226646    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.226652    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.226662    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.228242    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:51.724305    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:51.724331    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.724341    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.724348    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.728121    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:51.728579    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:51.728588    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.728593    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.728603    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.730578    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:51.730868    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:52.223952    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:52.224012    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.224021    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.224025    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.226458    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:52.227072    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:52.227080    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.227087    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.227090    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.228719    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:52.724240    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:52.724287    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.724299    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.724308    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.727394    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:52.727827    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:52.727834    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.727840    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.727844    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.729417    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:53.224920    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:53.225020    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.225037    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.225045    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.228826    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:53.229364    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:53.229374    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.229380    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.229387    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.231081    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:53.723365    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:53.723381    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.723393    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.723397    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.725512    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:53.725934    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:53.725942    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.725948    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.725951    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.727517    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.223251    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:54.223290    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.223310    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.223318    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.225362    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.225778    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.225786    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.225792    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.225797    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.227316    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.227664    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:54.723470    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:54.723553    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.723566    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.723572    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.726339    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.727040    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.727047    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.727053    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.727056    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.729195    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.729717    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.729726    3744 pod_ready.go:82] duration metric: took 18.006599646s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.729733    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.729768    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:37:54.729773    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.729779    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.729782    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.731747    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.732348    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.732355    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.732364    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.732369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.734207    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.734716    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.734725    3744 pod_ready.go:82] duration metric: took 4.986587ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.734738    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.734775    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:37:54.734780    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.734785    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.734789    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.736900    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.737556    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.737563    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.737569    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.737573    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.739693    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.740047    3744 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.740059    3744 pod_ready.go:82] duration metric: took 5.312281ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.740065    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.740098    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:37:54.740102    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.740108    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.740113    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.742355    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.742925    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:54.742933    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.742939    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.742944    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.744985    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.745483    3744 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.745493    3744 pod_ready.go:82] duration metric: took 5.421796ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.745499    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.745536    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:37:54.745541    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.745547    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.745550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.747563    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.748056    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:54.748063    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.748069    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.748071    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.749754    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.750027    3744 pod_ready.go:93] pod "etcd-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.750036    3744 pod_ready.go:82] duration metric: took 4.531272ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.750045    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.924527    3744 request.go:632] Waited for 174.448251ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:37:54.924561    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:37:54.924565    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.924570    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.924576    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.926540    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:55.124217    3744 request.go:632] Waited for 197.191409ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:55.124320    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:55.124331    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.124342    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.124349    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.127699    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:55.127979    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:55.127988    3744 pod_ready.go:82] duration metric: took 377.933462ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.127995    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.323995    3744 request.go:632] Waited for 195.947787ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:37:55.324122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:37:55.324133    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.324142    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.324147    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.326536    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:55.524340    3744 request.go:632] Waited for 197.377407ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:55.524428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:55.524437    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.524444    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.524458    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.527694    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:55.528065    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:55.528075    3744 pod_ready.go:82] duration metric: took 400.071053ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.528082    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.724069    3744 request.go:632] Waited for 195.89026ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:37:55.724147    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:37:55.724160    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.724178    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.724193    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.727264    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:55.924174    3744 request.go:632] Waited for 196.444661ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:55.924262    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:55.924273    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.924284    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.924290    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.927217    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:55.927667    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:55.927677    3744 pod_ready.go:82] duration metric: took 399.585518ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.927691    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.123773    3744 request.go:632] Waited for 195.997614ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:37:56.123824    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:37:56.123834    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.123859    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.123868    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.126826    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:56.323602    3744 request.go:632] Waited for 196.242245ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:56.323669    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:56.323713    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.323725    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.323736    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.326205    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:56.326487    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:56.326497    3744 pod_ready.go:82] duration metric: took 398.79568ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.326504    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.525262    3744 request.go:632] Waited for 198.697997ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:56.525404    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:56.525415    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.525426    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.525435    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.528812    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:56.723576    3744 request.go:632] Waited for 194.289214ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:56.723635    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:56.723642    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.723648    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.723664    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.725655    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:56.726101    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:56.726110    3744 pod_ready.go:82] duration metric: took 399.596067ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.726117    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.923811    3744 request.go:632] Waited for 197.624636ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:37:56.923859    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:37:56.923866    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.923874    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.923879    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.926307    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.123874    3744 request.go:632] Waited for 197.165319ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.123948    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.123956    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.123964    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.123981    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.126673    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.127130    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:57.127139    3744 pod_ready.go:82] duration metric: took 401.01276ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.127146    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.323575    3744 request.go:632] Waited for 196.38297ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:37:57.323627    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:37:57.323635    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.323646    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.323654    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.326792    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:57.524981    3744 request.go:632] Waited for 197.675488ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:57.525056    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:57.525064    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.525072    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.525077    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.527436    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.527834    3744 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:57.527844    3744 pod_ready.go:82] duration metric: took 400.687607ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.527851    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.724761    3744 request.go:632] Waited for 196.867729ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:37:57.724843    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:37:57.724852    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.724860    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.724864    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.727338    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.924277    3744 request.go:632] Waited for 196.366483ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.924352    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.924361    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.924369    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.924376    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.926744    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.927036    3744 pod_ready.go:93] pod "kube-proxy-d45q5" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:57.927045    3744 pod_ready.go:82] duration metric: took 399.185058ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.927052    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.123932    3744 request.go:632] Waited for 196.831846ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:58.124040    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:58.124050    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.124062    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.124067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.127075    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:58.323899    3744 request.go:632] Waited for 196.438465ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.323934    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.323939    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.323946    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.323982    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.326076    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:58.326347    3744 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:58.326358    3744 pod_ready.go:82] duration metric: took 399.29367ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.326365    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.524333    3744 request.go:632] Waited for 197.864558ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:37:58.524448    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:37:58.524460    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.524471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.524478    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.527937    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:58.724668    3744 request.go:632] Waited for 196.043209ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.724763    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.724780    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.724797    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.724815    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.727732    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:58.728090    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:58.728099    3744 pod_ready.go:82] duration metric: took 401.725065ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.728105    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.925170    3744 request.go:632] Waited for 197.0037ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:37:58.925325    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:37:58.925339    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.925351    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.925358    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.928967    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:59.124043    3744 request.go:632] Waited for 194.666869ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:59.124133    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:59.124143    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.124154    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.124161    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.127137    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:59.127523    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:59.127532    3744 pod_ready.go:82] duration metric: took 399.417767ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:59.127541    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:59.324020    3744 request.go:632] Waited for 196.418346ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:59.324169    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:59.324180    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.324191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.324200    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.327657    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:59.523961    3744 request.go:632] Waited for 195.650623ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:59.524073    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:59.524086    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.524097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.524105    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.527091    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:59.527542    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:59.527550    3744 pod_ready.go:82] duration metric: took 399.999976ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:59.527558    3744 pod_ready.go:39] duration metric: took 22.814715363s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:37:59.527569    3744 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:37:59.527620    3744 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:37:59.540037    3744 api_server.go:72] duration metric: took 23.068203242s to wait for apiserver process to appear ...
	I0831 15:37:59.540049    3744 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:37:59.540059    3744 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:37:59.543113    3744 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:37:59.543146    3744 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:37:59.543150    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.543156    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.543161    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.543866    3744 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:37:59.543927    3744 api_server.go:141] control plane version: v1.31.0
	I0831 15:37:59.543936    3744 api_server.go:131] duration metric: took 3.882759ms to wait for apiserver health ...
	I0831 15:37:59.543942    3744 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:37:59.723587    3744 request.go:632] Waited for 179.596374ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:59.723694    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:59.723708    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.723718    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.723734    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.728359    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:59.733656    3744 system_pods.go:59] 24 kube-system pods found
	I0831 15:37:59.733668    3744 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:37:59.733672    3744 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:37:59.733676    3744 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:37:59.733679    3744 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:37:59.733681    3744 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:37:59.733684    3744 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:37:59.733686    3744 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:37:59.733689    3744 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:37:59.733691    3744 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:37:59.733694    3744 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:37:59.733696    3744 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:37:59.733699    3744 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:37:59.733702    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:37:59.733705    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:37:59.733708    3744 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:37:59.733710    3744 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:37:59.733714    3744 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:37:59.733718    3744 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:37:59.733721    3744 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:37:59.733724    3744 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:37:59.733726    3744 system_pods.go:61] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:37:59.733729    3744 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:37:59.733731    3744 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:37:59.733734    3744 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:37:59.733738    3744 system_pods.go:74] duration metric: took 189.789494ms to wait for pod list to return data ...
	I0831 15:37:59.733743    3744 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:37:59.923784    3744 request.go:632] Waited for 189.987121ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:59.923870    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:59.923881    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.923893    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.923900    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.927288    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:59.927352    3744 default_sa.go:45] found service account: "default"
	I0831 15:37:59.927361    3744 default_sa.go:55] duration metric: took 193.611323ms for default service account to be created ...
	I0831 15:37:59.927366    3744 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:38:00.124803    3744 request.go:632] Waited for 197.388029ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:38:00.124898    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:38:00.124909    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:00.124920    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:00.124939    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:00.129956    3744 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:38:00.134973    3744 system_pods.go:86] 24 kube-system pods found
	I0831 15:38:00.134985    3744 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:38:00.134989    3744 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:38:00.134993    3744 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:38:00.134996    3744 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:38:00.134999    3744 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:38:00.135002    3744 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:38:00.135005    3744 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:38:00.135008    3744 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:38:00.135011    3744 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:38:00.135013    3744 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:38:00.135017    3744 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:38:00.135019    3744 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:38:00.135025    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:38:00.135028    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:38:00.135031    3744 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:38:00.135034    3744 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:38:00.135037    3744 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:38:00.135039    3744 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:38:00.135042    3744 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:38:00.135045    3744 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:38:00.135049    3744 system_pods.go:89] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:38:00.135051    3744 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:38:00.135056    3744 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:38:00.135060    3744 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:38:00.135065    3744 system_pods.go:126] duration metric: took 207.692433ms to wait for k8s-apps to be running ...
	I0831 15:38:00.135070    3744 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:38:00.135137    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:38:00.146618    3744 system_svc.go:56] duration metric: took 11.54297ms WaitForService to wait for kubelet
	I0831 15:38:00.146633    3744 kubeadm.go:582] duration metric: took 23.674794454s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:38:00.146650    3744 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:38:00.324468    3744 request.go:632] Waited for 177.772827ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:38:00.324541    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:38:00.324549    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:00.324557    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:00.324561    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:00.326804    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:38:00.327655    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:38:00.327666    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:38:00.327673    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:38:00.327677    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:38:00.327680    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:38:00.327683    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:38:00.327689    3744 node_conditions.go:105] duration metric: took 181.029342ms to run NodePressure ...
	I0831 15:38:00.327697    3744 start.go:241] waiting for startup goroutines ...
	I0831 15:38:00.327709    3744 start.go:255] writing updated cluster config ...
	I0831 15:38:00.348472    3744 out.go:201] 
	I0831 15:38:00.369311    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:00.369379    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:38:00.391565    3744 out.go:177] * Starting "ha-949000-m04" worker node in "ha-949000" cluster
	I0831 15:38:00.433358    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:38:00.433417    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:38:00.433601    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:38:00.433620    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:38:00.433752    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:38:00.434936    3744 start.go:360] acquireMachinesLock for ha-949000-m04: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:38:00.435036    3744 start.go:364] duration metric: took 76.344µs to acquireMachinesLock for "ha-949000-m04"
	I0831 15:38:00.435061    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:38:00.435070    3744 fix.go:54] fixHost starting: m04
	I0831 15:38:00.435494    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:38:00.435519    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:38:00.444781    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51881
	I0831 15:38:00.445158    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:38:00.445521    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:38:00.445531    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:38:00.445763    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:38:00.445892    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:00.445989    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:38:00.446076    3744 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:00.446156    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:38:00.447072    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid 3377 missing from process table
	I0831 15:38:00.447102    3744 fix.go:112] recreateIfNeeded on ha-949000-m04: state=Stopped err=<nil>
	I0831 15:38:00.447112    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	W0831 15:38:00.447197    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:38:00.468433    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m04" ...
	I0831 15:38:00.542198    3744 main.go:141] libmachine: (ha-949000-m04) Calling .Start
	I0831 15:38:00.542515    3744 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:00.542650    3744 main.go:141] libmachine: (ha-949000-m04) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid
	I0831 15:38:00.544312    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid 3377 missing from process table
	I0831 15:38:00.544344    3744 main.go:141] libmachine: (ha-949000-m04) DBG | pid 3377 is in state "Stopped"
	I0831 15:38:00.544372    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid...
	I0831 15:38:00.544580    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Using UUID 5ee34770-2239-4427-9789-bd204fe095a6
	I0831 15:38:00.571913    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Generated MAC 8a:3c:61:5f:c5:84
	I0831 15:38:00.571940    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:38:00.572058    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bec00)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:38:00.572092    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bec00)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:38:00.572124    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "5ee34770-2239-4427-9789-bd204fe095a6", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:38:00.572235    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 5ee34770-2239-4427-9789-bd204fe095a6 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:38:00.572259    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:38:00.573709    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Pid is 3806
	I0831 15:38:00.574064    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 0
	I0831 15:38:00.574112    3744 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:00.574129    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3806
	I0831 15:38:00.576177    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:38:00.576262    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:38:00.576305    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eca7}
	I0831 15:38:00.576319    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:38:00.576335    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:38:00.576351    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d4eb85}
	I0831 15:38:00.576382    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Found match: 8a:3c:61:5f:c5:84
	I0831 15:38:00.576399    3744 main.go:141] libmachine: (ha-949000-m04) DBG | IP: 192.169.0.8
	I0831 15:38:00.576410    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetConfigRaw
	I0831 15:38:00.577215    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:00.577389    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:38:00.577864    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:38:00.577878    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:00.578009    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:00.578108    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:00.578212    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:00.578342    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:00.578431    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:00.578558    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:00.578712    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:00.578720    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:38:00.582294    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:38:00.590710    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:38:00.591705    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:38:00.591723    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:38:00.591734    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:38:00.591743    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:38:00.976655    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:38:00.976695    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:38:01.091423    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:38:01.091445    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:38:01.091527    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:38:01.091554    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:38:01.092272    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:38:01.092283    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:38:06.721349    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:38:06.721473    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:38:06.721482    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:38:06.745779    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:38:11.647284    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:38:11.647298    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:38:11.647457    3744 buildroot.go:166] provisioning hostname "ha-949000-m04"
	I0831 15:38:11.647468    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:38:11.647566    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.647657    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:11.647737    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.647830    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.647929    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:11.648056    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:11.648211    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:11.648224    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m04 && echo "ha-949000-m04" | sudo tee /etc/hostname
	I0831 15:38:11.720881    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m04
	
	I0831 15:38:11.720895    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.721030    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:11.721141    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.721229    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.721323    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:11.721445    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:11.721583    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:11.721594    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:38:11.787551    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:38:11.787565    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:38:11.787574    3744 buildroot.go:174] setting up certificates
	I0831 15:38:11.787580    3744 provision.go:84] configureAuth start
	I0831 15:38:11.787586    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:38:11.787717    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:11.787807    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.787897    3744 provision.go:143] copyHostCerts
	I0831 15:38:11.787923    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:38:11.787987    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:38:11.787993    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:38:11.788130    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:38:11.788325    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:38:11.788370    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:38:11.788375    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:38:11.788450    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:38:11.788631    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:38:11.788686    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:38:11.788692    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:38:11.788777    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:38:11.788936    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m04 san=[127.0.0.1 192.169.0.8 ha-949000-m04 localhost minikube]
	I0831 15:38:11.923616    3744 provision.go:177] copyRemoteCerts
	I0831 15:38:11.923670    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:38:11.923684    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.923822    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:11.923908    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.924002    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:11.924089    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:11.965052    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:38:11.965128    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:38:11.989075    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:38:11.989152    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:38:12.008938    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:38:12.009008    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:38:12.028923    3744 provision.go:87] duration metric: took 241.333371ms to configureAuth
	I0831 15:38:12.028939    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:38:12.029131    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:12.029146    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:12.029282    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:12.029361    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:12.029448    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.029527    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.029620    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:12.029746    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:12.029867    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:12.029874    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:38:12.090450    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:38:12.090463    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:38:12.090535    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:38:12.090548    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:12.090681    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:12.090786    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.090898    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.091016    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:12.091186    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:12.091325    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:12.091371    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:38:12.161741    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	Environment=NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:38:12.161767    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:12.161902    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:12.161995    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.162101    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.162204    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:12.162325    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:12.162467    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:12.162479    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:38:13.717080    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:38:13.717094    3744 machine.go:96] duration metric: took 13.139081069s to provisionDockerMachine
	I0831 15:38:13.717101    3744 start.go:293] postStartSetup for "ha-949000-m04" (driver="hyperkit")
	I0831 15:38:13.717109    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:38:13.717123    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.717308    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:38:13.717321    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.717411    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.717514    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.717598    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.717686    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:13.753970    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:38:13.757041    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:38:13.757049    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:38:13.757147    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:38:13.757317    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:38:13.757323    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:38:13.757520    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:38:13.764743    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:38:13.784545    3744 start.go:296] duration metric: took 67.430377ms for postStartSetup
	I0831 15:38:13.784594    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.784782    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:38:13.784795    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.784891    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.784980    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.785074    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.785157    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:13.822419    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:38:13.822478    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:38:13.856251    3744 fix.go:56] duration metric: took 13.421034183s for fixHost
	I0831 15:38:13.856276    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.856412    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.856504    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.856591    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.856670    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.856794    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:13.856933    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:13.856940    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:38:13.917606    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143893.981325007
	
	I0831 15:38:13.917619    3744 fix.go:216] guest clock: 1725143893.981325007
	I0831 15:38:13.917634    3744 fix.go:229] Guest: 2024-08-31 15:38:13.981325007 -0700 PDT Remote: 2024-08-31 15:38:13.856265 -0700 PDT m=+124.128653576 (delta=125.060007ms)
	I0831 15:38:13.917650    3744 fix.go:200] guest clock delta is within tolerance: 125.060007ms
	I0831 15:38:13.917655    3744 start.go:83] releasing machines lock for "ha-949000-m04", held for 13.482464465s
	I0831 15:38:13.917676    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.917802    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:13.942019    3744 out.go:177] * Found network options:
	I0831 15:38:13.963076    3744 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	W0831 15:38:13.984049    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:38:13.984067    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:38:13.984075    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:38:13.984086    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.984514    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.984633    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.984692    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:38:13.984722    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	W0831 15:38:13.984773    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:38:13.984786    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:38:13.984810    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	W0831 15:38:13.984809    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:38:13.984873    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:38:13.984894    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.984907    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.984995    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.985009    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.985085    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.985105    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:13.985186    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.985271    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	W0831 15:38:14.024342    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:38:14.024407    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:38:14.067158    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:38:14.067173    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:38:14.067244    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:38:14.082520    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:38:14.090779    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:38:14.099040    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:38:14.099091    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:38:14.107242    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:38:14.115660    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:38:14.124011    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:38:14.132309    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:38:14.140696    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:38:14.149089    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:38:14.157409    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:38:14.165662    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:38:14.173102    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:38:14.180728    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:14.276483    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:38:14.296705    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:38:14.296785    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:38:14.312751    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:38:14.325397    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:38:14.342774    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:38:14.353024    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:38:14.363251    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:38:14.380028    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:38:14.390424    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:38:14.405244    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:38:14.408231    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:38:14.415934    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:38:14.429648    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:38:14.529094    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:38:14.646662    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:38:14.646690    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:38:14.660870    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:14.760474    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:38:17.038586    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.278065529s)
	I0831 15:38:17.038650    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:38:17.049008    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:38:17.062620    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:38:17.073607    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:38:17.168850    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:38:17.269764    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:17.377489    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:38:17.390666    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:38:17.402072    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:17.507294    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:38:17.568987    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:38:17.569066    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:38:17.574853    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:38:17.574909    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:38:17.578814    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:38:17.605368    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:38:17.605446    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:38:17.624343    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:38:17.679051    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:38:17.753456    3744 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:38:17.812386    3744 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:38:17.902651    3744 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	I0831 15:38:17.924439    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:17.924700    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:38:17.928251    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:38:17.938446    3744 mustload.go:65] Loading cluster: ha-949000
	I0831 15:38:17.938620    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:17.938850    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:38:17.938873    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:38:17.947622    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51903
	I0831 15:38:17.948032    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:38:17.948446    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:38:17.948460    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:38:17.948674    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:38:17.948791    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:38:17.948881    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:17.948987    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:38:17.950000    3744 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:38:17.950260    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:38:17.950294    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:38:17.959428    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51905
	I0831 15:38:17.959777    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:38:17.960145    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:38:17.960162    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:38:17.960360    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:38:17.960471    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:38:17.960562    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.8
	I0831 15:38:17.960568    3744 certs.go:194] generating shared ca certs ...
	I0831 15:38:17.960576    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:38:17.960771    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:38:17.960844    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:38:17.960854    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:38:17.960878    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:38:17.960897    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:38:17.960914    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:38:17.961001    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:38:17.961051    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:38:17.961060    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:38:17.961098    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:38:17.961130    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:38:17.961166    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:38:17.961235    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:38:17.961269    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:38:17.961290    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:17.961312    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:38:17.961342    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:38:17.980971    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:38:18.000269    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:38:18.019936    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:38:18.039774    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:38:18.059357    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:38:18.078502    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:38:18.097967    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:38:18.102444    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:38:18.111969    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:38:18.115584    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:38:18.115639    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:38:18.119889    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:38:18.129130    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:38:18.138067    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:18.141420    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:18.141464    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:18.145592    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:38:18.154725    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:38:18.163859    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:38:18.167695    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:38:18.167749    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:38:18.172178    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:38:18.181412    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:38:18.184441    3744 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:38:18.184478    3744 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.31.0  false true} ...
	I0831 15:38:18.184543    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:38:18.184588    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:38:18.192672    3744 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:38:18.192722    3744 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:38:18.201203    3744 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256
	I0831 15:38:18.201203    3744 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256
	I0831 15:38:18.201205    3744 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256
	I0831 15:38:18.201219    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:38:18.201219    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:38:18.201260    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:38:18.201327    3744 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:38:18.201327    3744 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:38:18.213304    3744 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:38:18.213305    3744 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:38:18.213306    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:38:18.213339    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:38:18.213339    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:38:18.213434    3744 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:38:18.234959    3744 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:38:18.235000    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
	I0831 15:38:18.870025    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0831 15:38:18.878175    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:38:18.892204    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:38:18.906289    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:38:18.909279    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:38:18.919652    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:19.014285    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:38:19.030068    3744 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}
	I0831 15:38:19.030257    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:19.052807    3744 out.go:177] * Verifying Kubernetes components...
	I0831 15:38:19.073469    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:19.170855    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:38:19.775316    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:38:19.775538    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:38:19.775580    3744 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:38:19.775737    3744 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:38:19.775777    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:19.775783    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:19.775789    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:19.775793    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:19.778097    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:20.276562    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:20.276584    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:20.276613    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:20.276621    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:20.279146    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:20.777079    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:20.777090    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:20.777097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:20.777101    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:20.779128    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:21.277261    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:21.277277    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:21.277283    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:21.277286    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:21.279452    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:21.776272    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:21.776285    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:21.776292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:21.776295    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:21.778482    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:21.778547    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:22.276209    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:22.276224    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:22.276233    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:22.276239    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:22.278431    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:22.775916    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:22.775932    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:22.775939    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:22.775943    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:22.778178    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:23.276360    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:23.276381    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:23.276392    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:23.276398    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:23.279406    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:23.775977    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:23.775995    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:23.776032    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:23.776037    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:23.778193    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:24.277072    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:24.277087    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:24.277093    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:24.277097    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:24.279300    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:24.279384    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:24.777071    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:24.777083    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:24.777089    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:24.777093    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:24.779084    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:25.277302    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:25.277326    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:25.277343    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:25.277370    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:25.280739    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:25.777360    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:25.777375    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:25.777382    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:25.777386    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:25.779584    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:26.277703    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:26.277720    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:26.277728    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:26.277739    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:26.279789    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:26.279858    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:26.776231    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:26.776272    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:26.776280    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:26.776285    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:26.778315    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:27.276174    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:27.276188    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:27.276194    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:27.276197    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:27.278437    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:27.776689    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:27.776708    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:27.776717    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:27.776721    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:27.779053    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:28.276081    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:28.276100    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:28.276111    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:28.276117    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:28.279235    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:28.776709    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:28.776722    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:28.776728    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:28.776732    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:28.778876    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:28.778948    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:29.276276    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:29.276292    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:29.276300    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:29.276306    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:29.278917    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:29.776120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:29.776137    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:29.776147    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:29.776153    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:29.778926    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:30.277099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:30.277114    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:30.277119    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:30.277121    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:30.279209    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:30.776289    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:30.776306    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:30.776318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:30.776325    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:30.778950    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:30.779042    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:31.277113    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:31.277129    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:31.277137    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:31.277142    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:31.279308    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:31.776871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:31.776885    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:31.776892    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:31.776907    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:31.779110    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:32.276639    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:32.276666    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:32.276677    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:32.276709    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:32.279642    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:32.776964    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:32.777005    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:32.777013    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:32.777017    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:32.778916    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:33.276097    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:33.276113    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:33.276120    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:33.276123    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:33.278201    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:33.278323    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:33.778025    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:33.778051    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:33.778062    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:33.778067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:33.781122    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:34.277596    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:34.277611    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:34.277617    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:34.277620    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:34.279507    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:34.776042    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:34.776055    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:34.776061    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:34.776064    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:34.778134    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:35.276180    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:35.276203    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:35.276281    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:35.276292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:35.279248    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:35.279324    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:35.776557    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:35.776577    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:35.776588    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:35.776595    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:35.779906    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:36.276525    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:36.276541    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:36.276547    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:36.276550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:36.278734    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:36.776417    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:36.776489    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:36.776512    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:36.776522    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:36.779524    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:37.277720    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:37.277733    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:37.277739    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:37.277743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:37.279925    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:37.279984    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:37.777252    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:37.777269    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:37.777274    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:37.777277    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:37.779271    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:38.278156    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:38.278211    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:38.278223    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:38.278229    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:38.280712    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:38.776178    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:38.776203    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:38.776213    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:38.776219    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:38.779093    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:39.276872    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:39.276885    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:39.276892    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:39.276896    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:39.279063    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:39.776884    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:39.776898    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:39.776905    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:39.776909    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:39.779259    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:39.779362    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:40.277202    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:40.277230    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:40.277242    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:40.277249    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:40.280416    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:40.776384    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:40.776396    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:40.776403    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:40.776406    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:40.778591    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:41.276444    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:41.276465    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:41.276477    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:41.276482    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:41.279236    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:41.777547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:41.777633    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:41.777648    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:41.777658    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:41.780834    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:41.780914    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:42.276626    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:42.276639    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:42.276646    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:42.276649    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:42.278771    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:42.777502    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:42.777527    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:42.777539    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:42.777544    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:42.780668    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:43.277508    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:43.277532    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:43.277544    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:43.277551    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:43.281198    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:43.777290    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:43.777306    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:43.777313    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:43.777316    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:43.779556    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:44.277098    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:44.277133    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:44.277144    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:44.277150    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:44.280482    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:44.280570    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:44.776182    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:44.776196    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:44.776204    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:44.776210    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:44.778630    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:45.276509    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:45.276522    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:45.276528    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:45.276540    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:45.278778    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:45.776791    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:45.776866    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:45.776879    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:45.776888    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:45.779812    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:46.277629    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:46.277650    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:46.277661    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:46.277669    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:46.280694    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:46.280771    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:46.776617    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:46.776632    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:46.776639    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:46.776644    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:46.778705    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:47.276853    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:47.276872    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:47.276881    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:47.276886    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:47.279224    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:47.777691    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:47.777716    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:47.777764    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:47.777772    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:47.780764    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:48.276263    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:48.276280    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:48.276286    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:48.276289    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:48.278387    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:48.776798    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:48.776866    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:48.776876    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:48.776880    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:48.779266    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:48.779328    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:49.277706    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:49.277731    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:49.277798    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:49.277809    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:49.280441    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:49.776295    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:49.776306    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:49.776312    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:49.776320    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:49.778554    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:50.278315    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:50.278338    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:50.278403    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:50.278414    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:50.281533    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:50.777763    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:50.777778    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:50.777787    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:50.777796    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:50.780173    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:50.780239    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:51.276316    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:51.276332    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:51.276338    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:51.276342    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:51.278631    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:51.776296    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:51.776316    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:51.776325    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:51.776330    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:51.778726    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:52.276790    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:52.276847    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:52.276864    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:52.276870    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:52.279948    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:52.777099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:52.777115    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:52.777121    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:52.777126    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:52.779325    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:53.276819    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:53.276881    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:53.276895    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:53.276904    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:53.279807    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:53.279883    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:53.776517    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:53.776532    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:53.776539    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:53.776543    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:53.778686    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:54.276276    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:54.276289    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:54.276299    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:54.276302    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:54.278627    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:54.777871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:54.777890    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:54.777900    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:54.777906    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:54.781132    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:55.276882    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:55.276901    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:55.276913    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:55.276919    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:55.280226    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:55.280299    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:55.777001    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:55.777014    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:55.777020    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:55.777023    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:55.779025    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:56.277691    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:56.277714    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:56.277726    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:56.277731    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:56.280819    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:56.778188    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:56.778247    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:56.778257    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:56.778262    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:56.780685    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:57.276330    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:57.276344    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:57.276350    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:57.276354    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:57.278527    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:57.776849    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:57.776867    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:57.776875    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:57.776880    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:57.779132    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:57.779222    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:58.276676    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:58.276715    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:58.276723    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:58.276727    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:58.278722    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:58.776823    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:58.776841    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:58.776847    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:58.776851    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:58.779004    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:59.277009    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:59.277030    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:59.277041    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:59.277049    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:59.280147    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:59.776972    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:59.776990    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:59.776999    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:59.777007    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:59.780392    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:59.780554    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:00.278237    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:00.278268    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:00.278275    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:00.278279    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:00.280339    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:00.776782    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:00.776803    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:00.776814    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:00.776819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:00.780040    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:01.276687    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:01.276709    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:01.276717    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:01.276722    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:01.279213    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:01.776982    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:01.776997    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:01.777004    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:01.777008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:01.779255    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:02.278179    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:02.278239    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:02.278253    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:02.278261    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:02.281537    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:02.281611    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:02.776749    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:02.776775    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:02.776786    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:02.776793    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:02.780000    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:03.278084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:03.278100    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:03.278108    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:03.278112    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:03.280525    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:03.776918    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:03.776956    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:03.777002    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:03.777009    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:03.780638    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:04.278461    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:04.278485    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:04.278497    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:04.278502    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:04.281718    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:04.281791    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:04.776376    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:04.776389    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:04.776395    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:04.776398    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:04.778509    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:05.276848    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:05.276872    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:05.276883    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:05.276889    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:05.279765    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:05.776397    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:05.776423    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:05.776433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:05.776439    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:05.779793    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:06.277954    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:06.277969    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:06.277977    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:06.277981    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:06.280446    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:06.777008    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:06.777039    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:06.777100    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:06.777109    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:06.780058    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:06.780166    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:07.277181    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:07.277203    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:07.277217    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:07.277222    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:07.280356    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:07.777895    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:07.777940    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:07.777952    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:07.777957    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:07.780087    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:08.276718    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:08.276745    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:08.276757    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:08.276763    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:08.279711    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:08.777099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:08.777121    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:08.777132    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:08.777137    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:08.780212    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:08.780293    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:09.277158    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:09.277177    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:09.277183    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:09.277188    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:09.279328    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:09.776613    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:09.776624    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:09.776630    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:09.776635    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:09.778784    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:10.276643    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:10.276662    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:10.276674    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:10.276682    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:10.279706    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:10.776484    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:10.776495    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:10.776501    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:10.776504    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:10.778860    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:11.276917    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:11.276968    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:11.276981    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:11.276990    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:11.280015    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:11.280097    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:11.777726    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:11.777745    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:11.777753    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:11.777758    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:11.780176    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:12.278046    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:12.278058    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:12.278063    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:12.278067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:12.280005    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:12.777919    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:12.777945    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:12.777992    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:12.777997    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:12.780507    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:13.278486    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:13.278543    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:13.278554    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:13.278559    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:13.281627    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:13.281745    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:13.776833    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:13.776854    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:13.776862    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:13.776866    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:13.779535    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:14.276922    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:14.276946    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:14.276958    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:14.276966    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:14.280174    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:14.776595    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:14.776617    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:14.776629    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:14.776634    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:14.779819    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:15.278222    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:15.278239    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:15.278247    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:15.278251    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:15.280553    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:15.776940    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:15.776965    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:15.776977    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:15.776983    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:15.780495    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:15.780576    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:16.276617    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:16.276642    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:16.276652    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:16.276656    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:16.279277    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:16.776588    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:16.776609    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:16.776618    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:16.776622    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:16.778820    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:17.277548    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:17.277568    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:17.277581    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:17.277590    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:17.280937    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:17.776562    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:17.776588    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:17.776600    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:17.776607    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:17.780085    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:18.277015    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:18.277036    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:18.277048    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:18.277056    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:18.280239    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:18.280318    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:18.777930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:18.777950    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:18.777961    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:18.777968    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:18.781102    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:19.278645    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:19.278671    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:19.278683    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:19.278689    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:19.282270    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:19.778214    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:19.778225    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:19.778230    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:19.778234    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:19.780140    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:20.277051    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:20.277079    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:20.277090    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:20.277098    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:20.280382    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:20.280543    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:20.776682    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:20.776706    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:20.776719    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:20.776724    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:20.780231    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:21.278070    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:21.278085    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:21.278092    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:21.278096    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:21.280488    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:21.776700    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:21.776723    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:21.776735    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:21.776743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:21.779589    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:22.276945    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:22.276985    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:22.276996    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:22.277001    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:22.279378    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:22.777198    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:22.777217    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:22.777226    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:22.777230    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:22.779837    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:22.779899    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:23.277517    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:23.277532    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:23.277540    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:23.277546    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:23.280021    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:23.776652    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:23.776672    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:23.776680    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:23.776685    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:23.779129    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:24.277535    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:24.277618    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:24.277631    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:24.277637    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:24.280844    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:24.776736    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:24.776755    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:24.776767    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:24.776774    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:24.779817    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:25.277529    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:25.277549    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:25.277560    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:25.277564    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:25.280343    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:25.280414    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:25.777390    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:25.777407    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:25.777415    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:25.777419    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:25.779809    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:26.277450    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:26.277472    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:26.277485    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:26.277492    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:26.279869    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:26.776900    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:26.776921    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:26.776929    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:26.776934    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:26.779045    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:27.277440    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:27.277457    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:27.277463    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:27.277467    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:27.279629    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:27.776631    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:27.776647    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:27.776655    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:27.776660    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:27.779236    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:27.779329    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:28.276659    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:28.276685    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:28.276697    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:28.276704    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:28.279990    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:28.777285    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:28.777319    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:28.777326    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:28.777330    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:28.779470    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:29.276786    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:29.276806    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:29.276818    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:29.276824    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:29.279639    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:29.777308    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:29.777319    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:29.777325    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:29.777328    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:29.779377    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:29.779445    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:30.277508    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:30.277524    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:30.277530    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:30.277535    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:30.279611    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:30.778698    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:30.778722    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:30.778737    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:30.778746    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:30.781722    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:31.276851    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:31.276867    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:31.276876    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:31.276888    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:31.279490    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:31.778105    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:31.778123    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:31.778133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:31.778137    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:31.780442    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:31.780510    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:32.278437    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:32.278459    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:32.278471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:32.278476    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:32.281165    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:32.778521    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:32.778597    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:32.778610    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:32.778618    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:32.781925    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:33.276802    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:33.276823    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:33.276832    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:33.276837    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:33.279437    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:33.777585    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:33.777608    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:33.777620    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:33.777629    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:33.780773    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:33.780858    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:34.277701    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:34.277717    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:34.277723    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:34.277726    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:34.279795    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:34.777419    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:34.777432    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:34.777438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:34.777442    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:34.779621    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:35.276815    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:35.276837    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:35.276847    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:35.276852    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:35.279717    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:35.778287    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:35.778312    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:35.778399    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:35.778409    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:35.781136    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:35.781210    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:36.276900    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:36.276915    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:36.276922    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:36.276925    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:36.279177    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:36.777030    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:36.777056    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:36.777068    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:36.777075    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:36.780399    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:37.276789    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:37.276805    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:37.276814    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:37.276819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:37.279300    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:37.777098    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:37.777112    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:37.777117    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:37.777121    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:37.779283    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:38.277802    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:38.277865    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:38.277876    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:38.277884    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:38.279839    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:38.279898    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:38.777985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:38.778008    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:38.778021    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:38.778027    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:38.781190    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:39.278167    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:39.278215    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:39.278222    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:39.278227    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:39.280014    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:39.778411    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:39.778425    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:39.778433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:39.778437    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:39.780400    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:40.276768    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:40.276779    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:40.276785    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:40.276789    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:40.278622    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:40.776752    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:40.776766    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:40.776792    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:40.776795    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:40.779016    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:40.779098    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:41.278166    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:41.278185    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:41.278202    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:41.278206    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:41.280453    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:41.776879    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:41.776941    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:41.776950    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:41.776956    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:41.779462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:42.277893    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:42.277906    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:42.277912    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:42.277916    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:42.279774    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:42.776804    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:42.776825    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:42.776836    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:42.776841    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:42.780314    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:42.780388    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:43.277438    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:43.277453    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:43.277461    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:43.277466    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:43.279958    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:43.776777    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:43.776790    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:43.776796    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:43.776799    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:43.778854    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:44.277120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:44.277141    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:44.277152    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:44.277167    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:44.280063    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:44.777870    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:44.777891    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:44.777902    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:44.777910    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:44.780670    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:44.780806    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:45.278440    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:45.278453    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:45.278459    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:45.278464    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:45.280687    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:45.776997    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:45.777022    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:45.777033    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:45.777045    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:45.779681    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:46.277720    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:46.277761    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:46.277771    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:46.277777    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:46.279827    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:46.777445    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:46.777460    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:46.777466    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:46.777469    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:46.779643    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:47.278055    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:47.278120    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:47.278134    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:47.278141    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:47.281004    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:47.281121    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:47.778899    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:47.778923    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:47.778933    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:47.779001    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:47.781920    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:48.278094    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:48.278140    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:48.278148    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:48.278153    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:48.280253    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:48.776917    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:48.776935    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:48.776947    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:48.776954    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:48.779870    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:49.277147    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:49.277168    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:49.277179    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:49.277186    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:49.279804    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:49.778489    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:49.778501    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:49.778508    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:49.778510    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:49.780670    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:49.780731    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:50.278221    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:50.278248    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:50.278302    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:50.278312    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:50.281268    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:50.777610    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:50.777650    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:50.777663    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:50.777672    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:50.780328    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:51.276863    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:51.276878    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:51.276884    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:51.276887    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:51.278829    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:51.778792    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:51.778815    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:51.778829    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:51.778836    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:51.782105    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:51.782175    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:52.277513    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:52.277532    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:52.277544    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:52.277550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:52.280450    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:52.778374    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:52.778390    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:52.778396    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:52.778399    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:52.780416    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:53.277640    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:53.277659    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:53.277671    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:53.277677    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:53.280752    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:53.778971    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:53.779023    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:53.779036    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:53.779044    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:53.782509    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:53.782591    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:54.277469    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:54.277484    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:54.277491    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:54.277495    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:54.279585    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:54.778653    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:54.778675    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:54.778688    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:54.778708    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:54.781762    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:55.277208    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:55.277222    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:55.277263    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:55.277270    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:55.279152    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:55.777066    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:55.777102    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:55.777110    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:55.777115    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:55.779288    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:56.278230    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:56.278241    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:56.278248    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:56.278251    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:56.280389    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:56.280448    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:56.778057    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:56.778137    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:56.778151    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:56.778158    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:56.781449    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:57.277127    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:57.277141    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:57.277148    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:57.277151    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:57.279114    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:57.778467    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:57.778478    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:57.778485    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:57.778487    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:57.780611    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:58.277035    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:58.277048    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:58.277064    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:58.277069    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:58.284343    3744 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
	I0831 15:39:58.284416    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:58.778691    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:58.778707    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:58.778714    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:58.778718    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:58.780664    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:59.277786    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:59.277801    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:59.277810    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:59.277815    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:59.280162    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:59.777363    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:59.777389    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:59.777400    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:59.777417    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:59.780437    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:00.278216    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:00.278231    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:00.278238    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:00.278241    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:00.280398    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:00.777947    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:00.777973    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:00.777985    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:00.777992    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:00.780895    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:00.780963    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:01.277061    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:01.277081    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:01.277097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:01.277105    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:01.280071    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:01.778574    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:01.778590    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:01.778596    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:01.778599    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:01.780602    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:02.277039    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:02.277051    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:02.277057    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:02.277060    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:02.279367    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:02.777088    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:02.777113    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:02.777124    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:02.777130    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:02.780010    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:03.277129    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:03.277143    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:03.277150    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:03.277155    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:03.279360    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:03.279419    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:03.779084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:03.779111    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:03.779120    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:03.779131    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:03.782578    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:04.279084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:04.279114    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:04.279193    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:04.279209    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:04.282418    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:04.777281    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:04.777294    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:04.777300    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:04.777304    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:04.779400    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:05.277304    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:05.277357    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:05.277370    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:05.277378    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:05.280048    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:05.280120    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:05.777871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:05.777897    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:05.777908    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:05.777914    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:05.781308    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:06.278341    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:06.278357    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:06.278365    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:06.278369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:06.280721    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:06.777260    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:06.777278    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:06.777313    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:06.777319    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:06.779441    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:07.277289    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:07.277354    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:07.277368    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:07.277376    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:07.280642    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:07.280705    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:07.777567    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:07.777583    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:07.777589    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:07.777591    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:07.779882    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:08.277957    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:08.277972    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:08.277980    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:08.277985    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:08.280280    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:08.777495    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:08.777523    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:08.777535    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:08.777541    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:08.780074    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:09.278397    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:09.278413    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:09.278419    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:09.278422    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:09.280703    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:09.280789    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:09.778365    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:09.778379    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:09.778388    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:09.778392    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:09.780762    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:10.277879    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:10.277891    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:10.277897    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:10.277900    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:10.279957    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:10.777727    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:10.777743    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:10.777749    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:10.777752    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:10.779982    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:11.277869    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:11.277892    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:11.277908    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:11.277916    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:11.281007    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:11.281122    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:11.777300    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:11.777325    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:11.777375    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:11.777385    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:11.780070    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:12.277419    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:12.277438    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:12.277444    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:12.277450    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:12.279959    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:12.778536    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:12.778559    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:12.778570    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:12.778577    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:12.782121    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:13.277460    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:13.277540    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:13.277558    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:13.277567    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:13.280462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:13.777386    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:13.777406    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:13.777417    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:13.777423    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:13.779721    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:13.779795    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:14.278352    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:14.278373    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:14.278382    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:14.278386    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:14.280995    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:14.777911    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:14.777931    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:14.777944    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:14.777953    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:14.780609    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:15.277390    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:15.277406    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:15.277413    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:15.277418    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:15.279552    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:15.777171    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:15.777196    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:15.777208    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:15.777213    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:15.780616    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:15.780690    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:16.278393    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:16.278413    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:16.278423    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:16.278431    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:16.281087    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:16.777493    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:16.777505    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:16.777511    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:16.777514    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:16.779511    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:17.277948    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:17.277963    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:17.277971    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:17.277975    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:17.281263    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:17.777610    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:17.777635    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:17.777645    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:17.777652    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:17.780711    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:17.780810    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:18.278407    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:18.278427    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:18.278438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:18.278445    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:18.280714    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:18.778225    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:18.778250    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:18.778258    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:18.778263    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:18.781566    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:19.278245    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:19.278271    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:19.278341    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:19.278351    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:19.281708    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:19.778206    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:19.778220    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:19.778226    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:19.778231    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:19.780309    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:20.277705    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:20.277724    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:20.277735    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:20.277743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:20.280797    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:20.280880    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:20.777518    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:20.777542    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:20.777554    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:20.777562    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:20.780637    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:21.277649    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:21.277665    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:21.277671    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:21.277675    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:21.280074    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:21.778048    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:21.778072    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:21.778084    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:21.778090    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:21.781448    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:22.277500    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:22.277519    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:22.277530    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:22.277535    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:22.280641    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:22.778428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:22.778446    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:22.778455    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:22.778461    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:22.780933    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:22.780991    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:23.277541    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:23.277605    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:23.277620    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:23.277627    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:23.280957    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:23.777433    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:23.777447    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:23.777454    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:23.777457    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:23.779506    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:24.277362    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:24.277385    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:24.277433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:24.277440    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:24.280068    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:24.778081    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:24.778099    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:24.778111    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:24.778117    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:24.781146    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:24.781249    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:25.278144    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:25.278167    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:25.278178    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:25.278185    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:25.281087    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:25.778478    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:25.778499    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:25.778540    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:25.778545    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:25.780863    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:26.277292    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:26.277320    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:26.277335    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:26.277342    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:26.280115    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:26.777557    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:26.777573    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:26.777581    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:26.777585    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:26.779974    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:27.278458    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:27.278474    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:27.278481    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:27.278484    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:27.280521    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:27.280595    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:27.777967    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:27.777987    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:27.777996    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:27.778001    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:27.780485    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:28.277807    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:28.277826    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:28.277838    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:28.277846    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:28.280427    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:28.777498    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:28.777510    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:28.777516    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:28.777520    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:28.779847    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:29.277964    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:29.277985    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:29.277996    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:29.278002    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:29.280815    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:29.280906    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:29.778537    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:29.778559    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:29.778570    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:29.778575    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:29.781623    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:30.277396    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:30.277412    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:30.277420    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:30.277424    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:30.279862    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:30.778701    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:30.778785    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:30.778800    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:30.778808    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:30.781829    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:31.278707    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:31.278727    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:31.278738    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:31.278744    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:31.281658    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:31.281726    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:31.778169    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:31.778189    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:31.778199    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:31.778205    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:31.781541    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:32.277415    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:32.277446    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:32.277488    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:32.277498    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:32.280928    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:32.777636    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:32.777722    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:32.777736    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:32.777743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:32.780331    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:33.278774    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:33.278793    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:33.278802    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:33.278807    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:33.281266    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:33.778581    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:33.778604    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:33.778615    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:33.778622    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:33.781819    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:33.781931    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:34.278488    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:34.278512    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:34.278538    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:34.278546    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:34.281635    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:34.777686    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:34.777700    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:34.777708    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:34.777713    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:34.780113    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:35.277895    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:35.277919    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:35.277930    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:35.277935    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:35.281263    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:35.777425    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:35.777449    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:35.777467    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:35.777477    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:35.780717    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:36.279317    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:36.279363    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:36.279373    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:36.279381    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:36.282024    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:36.282088    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:36.777443    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:36.777459    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:36.777468    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:36.777473    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:36.779899    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:37.278285    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:37.278300    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:37.278306    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:37.278311    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:37.280691    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:37.778439    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:37.778466    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:37.778477    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:37.778484    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:37.781678    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:38.279008    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:38.279038    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:38.279051    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:38.279059    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:38.282603    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:38.282694    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:38.778818    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:38.778844    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:38.778855    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:38.778861    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:38.783197    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:40:39.278660    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:39.278672    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:39.278678    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:39.278681    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:39.280786    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:39.777503    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:39.777522    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:39.777535    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:39.777541    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:39.780544    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:40.278292    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:40.278317    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:40.278329    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:40.278337    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:40.281137    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:40.778006    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:40.778032    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:40.778057    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:40.778071    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:40.781057    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:40.781158    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:41.278405    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:41.278469    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:41.278519    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:41.278533    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:41.281715    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:41.777417    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:41.777432    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:41.777438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:41.777441    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:41.779462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:42.278930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:42.278962    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:42.278969    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:42.278974    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:42.280885    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:42.778654    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:42.778673    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:42.778708    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:42.778714    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:42.781210    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:42.781277    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:43.277428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:43.277444    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:43.277450    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:43.277454    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:43.279641    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:43.778230    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:43.778243    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:43.778248    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:43.778252    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:43.780641    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:44.278516    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:44.278536    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:44.278545    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:44.278550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:44.280826    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:44.777524    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:44.777543    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:44.777554    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:44.777560    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:44.780897    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:45.279411    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:45.279427    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:45.279433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:45.279436    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:45.281622    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:45.281684    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:45.779055    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:45.779071    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:45.779078    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:45.779081    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:45.780982    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:46.278769    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:46.278788    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:46.278794    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:46.278799    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:46.280873    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:46.779140    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:46.779158    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:46.779191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:46.779195    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:46.781223    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:47.277666    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:47.277689    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:47.277725    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:47.277732    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:47.280012    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:47.778363    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:47.778385    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:47.778394    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:47.778399    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:47.780853    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:47.780917    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:48.277895    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:48.277909    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:48.277915    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:48.277917    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:48.279760    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:48.778443    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:48.778469    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:48.778480    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:48.778487    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:48.782101    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:49.278898    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:49.278941    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:49.278948    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:49.278952    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:49.280953    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:49.779196    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:49.779209    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:49.779218    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:49.779222    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:49.781778    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:49.781836    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:50.277693    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:50.277708    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:50.277717    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:50.277721    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:50.279726    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:50.778035    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:50.778058    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:50.778070    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:50.778079    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:50.781019    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:51.277510    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:51.277549    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:51.277564    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:51.277567    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:51.279483    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:51.779060    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:51.779084    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:51.779113    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:51.779118    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:51.781564    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:52.278175    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:52.278187    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:52.278193    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:52.278197    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:52.280098    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:52.280167    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:52.778702    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:52.778717    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:52.778726    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:52.778730    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:52.781143    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:53.278862    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:53.278918    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:53.278925    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:53.278930    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:53.281004    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:53.779375    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:53.779401    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:53.779412    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:53.779418    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:53.783259    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:54.279308    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:54.279324    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:54.279331    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:54.279334    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:54.281428    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:54.281496    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:54.778158    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:54.778177    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:54.778191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:54.778198    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:54.781197    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:55.277649    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:55.277663    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:55.277668    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:55.277672    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:55.279760    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:55.779263    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:55.779320    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:55.779330    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:55.779335    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:55.781789    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:56.278112    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:56.278124    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:56.278129    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:56.278133    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:56.280134    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:56.777990    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:56.778010    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:56.778022    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:56.778032    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:56.781213    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:56.781297    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:57.279143    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:57.279158    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:57.279164    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:57.279168    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:57.280873    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:57.778857    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:57.778881    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:57.778893    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:57.778900    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:57.781501    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:58.278300    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:58.278312    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:58.278318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:58.278324    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:58.280252    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:58.779105    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:58.779139    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:58.779146    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:58.779150    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:58.780953    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:59.277772    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:59.277810    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:59.277817    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:59.277821    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:59.279828    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:59.279892    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:59.778306    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:59.778323    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:59.778334    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:59.778339    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:59.781562    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:00.279133    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:00.279147    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:00.279154    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:00.279157    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:00.280989    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:00.777674    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:00.777695    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:00.777706    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:00.777714    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:00.780625    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:01.278700    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:01.278712    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:01.278718    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:01.278722    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:01.280680    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:01.280741    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:01.778674    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:01.778695    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:01.778704    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:01.778709    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:01.781227    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:02.278744    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:02.278759    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:02.278764    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:02.278767    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:02.280603    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:02.778554    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:02.778581    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:02.778646    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:02.778654    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:02.781844    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:03.277956    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:03.277979    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:03.277986    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:03.277990    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:03.279854    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:03.777682    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:03.777698    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:03.777704    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:03.777707    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:03.780161    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:03.780218    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:04.278517    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:04.278529    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:04.278535    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:04.278537    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:04.280362    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:04.778969    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:04.778980    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:04.778986    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:04.778989    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:04.782704    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:05.278152    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:05.278165    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:05.278170    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:05.278173    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:05.280542    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:05.778218    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:05.778303    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:05.778320    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:05.778329    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:05.781501    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:05.781585    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:06.277766    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:06.277778    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:06.277784    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:06.277787    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:06.279880    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:06.778001    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:06.778021    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:06.778033    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:06.778039    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:06.781121    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:07.278457    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:07.278468    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:07.278474    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:07.278478    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:07.280352    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:07.778000    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:07.778020    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:07.778031    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:07.778037    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:07.781054    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:08.277960    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:08.277972    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:08.277978    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:08.277982    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:08.279721    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:08.279790    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:08.777988    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:08.778006    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:08.778014    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:08.778019    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:08.780429    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:09.278866    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:09.278887    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:09.278894    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:09.278898    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:09.280928    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:09.777942    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:09.777961    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:09.777972    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:09.777978    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:09.781177    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:10.279287    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:10.279340    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:10.279352    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:10.279358    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:10.281250    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:10.281309    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:10.779851    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:10.779871    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:10.779883    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:10.779888    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:10.783174    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:11.279199    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:11.279213    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:11.279219    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:11.279223    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:11.281075    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:11.778332    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:11.778350    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:11.778359    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:11.778363    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:11.780504    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:12.279627    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:12.279653    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:12.279665    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:12.279671    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:12.282520    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:12.282615    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:12.778391    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:12.778412    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:12.778424    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:12.778428    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:12.781446    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:13.278111    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:13.278123    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:13.278129    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:13.278132    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:13.279887    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:13.779218    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:13.779233    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:13.779239    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:13.779242    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:13.781352    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:14.277806    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:14.277818    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:14.277823    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:14.277827    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:14.279913    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:14.779779    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:14.779797    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:14.779808    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:14.779814    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:14.783141    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:14.783269    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:15.278003    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:15.278017    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:15.278023    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:15.278027    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:15.279730    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:15.778699    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:15.778720    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:15.778731    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:15.778737    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:15.781939    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:16.278805    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:16.278818    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:16.278846    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:16.278851    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:16.280696    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:16.778278    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:16.778298    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:16.778307    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:16.778312    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:16.780692    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:17.278010    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:17.278061    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:17.278071    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:17.278075    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:17.280183    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:17.280244    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:17.779658    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:17.779684    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:17.779696    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:17.779703    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:17.782964    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:18.279131    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:18.279146    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:18.279152    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:18.279155    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:18.281127    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:18.778591    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:18.778613    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:18.778624    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:18.778631    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:18.781947    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:19.278144    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:19.278156    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:19.278162    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:19.278165    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:19.280314    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:19.280374    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:19.779309    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:19.779328    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:19.779339    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:19.779346    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:19.782226    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:20.278897    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:20.278909    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:20.278914    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:20.278917    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:20.280839    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:20.779038    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:20.779071    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:20.779085    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:20.779095    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:20.782073    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:21.278315    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:21.278364    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:21.278371    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:21.278376    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:21.280407    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:21.280468    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:21.778122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:21.778137    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:21.778144    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:21.778146    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:21.780207    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:22.278547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:22.278561    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:22.278568    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:22.278571    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:22.280976    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:22.778009    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:22.778029    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:22.778040    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:22.778045    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:22.780889    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:23.278954    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:23.278999    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:23.279008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:23.279011    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:23.283528    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:41:23.283590    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:23.779486    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:23.779512    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:23.779523    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:23.779536    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:23.782922    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:24.277863    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:24.277876    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:24.277882    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:24.277885    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:24.279860    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:24.779167    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:24.779185    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:24.779196    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:24.779202    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:24.782106    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:25.279100    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:25.279120    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:25.279131    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:25.279139    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:25.282042    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:25.778565    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:25.778640    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:25.778655    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:25.778663    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:25.781719    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:25.781792    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:26.279146    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:26.279182    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:26.279223    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:26.279229    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:26.282148    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:26.778592    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:26.778614    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:26.778626    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:26.778632    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:26.782054    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:27.278821    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:27.278835    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:27.278842    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:27.278845    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:27.281364    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:27.778073    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:27.778100    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:27.778118    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:27.778125    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:27.781324    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:28.277935    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:28.277959    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:28.278022    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:28.278033    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:28.281297    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:28.281465    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:28.778608    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:28.778635    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:28.778648    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:28.778656    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:28.781848    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:29.278110    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:29.278132    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:29.278143    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:29.278148    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:29.281146    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:29.778251    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:29.778265    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:29.778273    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:29.778277    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:29.782398    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:41:30.279687    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:30.279700    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:30.279708    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:30.279712    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:30.282090    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:30.282159    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:30.779599    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:30.779624    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:30.779636    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:30.779642    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:30.783210    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:31.279353    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:31.279366    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:31.279372    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:31.279376    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:31.281276    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:31.779671    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:31.779692    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:31.779704    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:31.779709    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:31.781611    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:32.279371    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:32.279395    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:32.279435    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:32.279442    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:32.282259    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:32.282329    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:32.779427    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:32.779446    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:32.779458    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:32.779463    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:32.782235    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:33.279452    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:33.279465    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:33.279471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:33.279474    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:33.281321    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:33.778052    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:33.778072    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:33.778083    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:33.778089    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:33.781878    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:34.278548    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:34.278567    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:34.278575    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:34.278584    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:34.281417    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:34.779174    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:34.779193    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:34.779205    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:34.779210    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:34.782050    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:34.782115    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:35.278139    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:35.278152    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:35.278158    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:35.278162    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:35.279993    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:35.779363    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:35.779450    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:35.779465    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:35.779473    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:35.782313    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:36.278357    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:36.278383    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:36.278394    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:36.278400    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:36.281375    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:36.778762    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:36.778799    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:36.778808    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:36.778814    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:36.780954    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:37.279993    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:37.280053    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:37.280067    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:37.280075    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:37.282945    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:37.283021    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:37.779739    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:37.779783    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:37.779790    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:37.779796    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:37.781629    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:38.279147    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:38.279171    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:38.279184    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:38.279190    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:38.281843    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:38.778741    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:38.778764    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:38.778814    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:38.778819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:38.781350    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:39.279360    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:39.279391    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:39.279399    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:39.279405    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:39.281151    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:39.778714    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:39.778733    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:39.778744    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:39.778752    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:39.781665    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:39.781800    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:40.278379    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:40.278393    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:40.278401    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:40.278405    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:40.280809    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:40.779343    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:40.779384    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:40.779392    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:40.779398    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:40.781388    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:41.279463    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:41.279490    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:41.279503    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:41.279508    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:41.282590    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:41.779242    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:41.779260    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:41.779267    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:41.779272    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:41.781369    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:42.279466    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:42.279483    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:42.279489    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:42.279492    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:42.281217    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:42.281311    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:42.778084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:42.778101    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:42.778109    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:42.778112    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:42.780674    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:43.279061    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:43.279078    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:43.279088    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:43.279093    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:43.281059    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:43.779095    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:43.779129    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:43.779136    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:43.779138    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:43.781068    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:44.279029    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:44.279048    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:44.279058    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:44.279063    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:44.281431    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:44.281488    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:44.779540    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:44.779553    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:44.779562    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:44.779566    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:44.782120    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:45.278415    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:45.278429    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:45.278440    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:45.278444    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:45.280960    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:45.778255    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:45.778298    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:45.778305    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:45.778309    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:45.780347    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:46.279010    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:46.279030    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:46.279041    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:46.279046    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:46.282148    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:46.282239    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:46.779747    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:46.779768    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:46.779776    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:46.779782    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:46.782151    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:47.278274    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:47.278298    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:47.278339    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:47.278345    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:47.280731    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:47.778365    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:47.778390    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:47.778399    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:47.778408    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:47.781184    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:48.279756    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:48.279775    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:48.279785    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:48.279789    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:48.282380    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:48.282440    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:48.780165    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:48.780186    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:48.780197    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:48.780205    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:48.783195    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:49.278649    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:49.278669    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:49.278680    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:49.278685    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:49.281793    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:49.780041    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:49.780056    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:49.780064    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:49.780069    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:49.782464    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:50.278528    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:50.278538    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:50.278545    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:50.278549    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:50.280284    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:50.778556    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:50.778582    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:50.778591    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:50.778596    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:50.781794    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:50.781879    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:51.278412    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:51.278448    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:51.278456    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:51.278459    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:51.280359    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:51.778770    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:51.778852    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:51.778867    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:51.778876    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:51.781710    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:52.279069    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:52.279089    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:52.279101    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:52.279107    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:52.282688    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:52.778612    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:52.778627    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:52.778634    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:52.778636    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:52.780790    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:53.278839    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:53.278918    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:53.278932    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:53.278939    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:53.281791    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:53.281864    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:53.778930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:53.778944    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:53.778953    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:53.778998    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:53.780750    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:54.279141    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:54.279163    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:54.279202    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:54.279208    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:54.281212    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:54.780288    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:54.780307    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:54.780318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:54.780326    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:54.783446    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:55.278636    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:55.278655    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:55.278669    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:55.278675    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:55.281304    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:55.778487    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:55.778506    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:55.778513    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:55.778517    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:55.780794    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:55.780852    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:56.279529    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:56.279542    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:56.279548    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:56.279552    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:56.281403    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:56.779390    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:56.779415    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:56.779427    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:56.779435    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:56.782652    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:57.279730    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:57.279749    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:57.279775    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:57.279778    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:57.282199    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:57.778341    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:57.778353    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:57.778360    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:57.778364    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:57.780339    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:58.280180    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:58.280200    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:58.280208    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:58.280212    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:58.283270    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:58.283333    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:58.778922    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:58.778934    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:58.778941    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:58.778944    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:58.781165    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:59.278658    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:59.278670    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:59.278677    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:59.278680    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:59.280526    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:59.780251    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:59.780269    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:59.780278    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:59.780285    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:59.783254    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:00.278299    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:00.278311    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:00.278318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:00.278321    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:00.280462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:00.778333    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:00.778357    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:00.778417    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:00.778425    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:00.781396    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:00.781503    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:01.279261    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:01.279281    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:01.279292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:01.279299    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:01.282233    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:01.778447    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:01.778464    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:01.778472    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:01.778476    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:01.780643    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:02.278526    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:02.278545    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:02.278557    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:02.278563    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:02.281443    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:02.778669    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:02.778693    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:02.778704    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:02.778709    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:02.782028    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:02.782104    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:03.278662    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:03.278675    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:03.278681    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:03.278684    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:03.281034    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:03.779554    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:03.779600    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:03.779611    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:03.779619    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:03.782537    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:04.278499    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:04.278522    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:04.278534    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:04.278542    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:04.281683    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:04.779122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:04.779133    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:04.779140    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:04.779143    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:04.781151    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:42:05.279493    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:05.279515    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:05.279527    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:05.279535    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:05.283494    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:05.283569    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:05.779088    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:05.779167    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:05.779181    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:05.779187    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:05.782371    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:06.279314    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:06.279355    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:06.279363    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:06.279369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:06.281532    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:06.779431    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:06.779454    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:06.779465    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:06.779473    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:06.782521    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:07.279058    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:07.279070    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:07.279078    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:07.279083    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:07.281403    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:07.780066    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:07.780081    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:07.780088    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:07.780092    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:07.782413    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:07.782477    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:08.278582    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:08.278601    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:08.278612    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:08.278617    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:08.281655    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:08.779457    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:08.779482    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:08.779494    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:08.779500    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:08.782874    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:09.278624    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:09.278660    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:09.278667    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:09.278671    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:09.280685    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:09.780183    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:09.780196    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:09.780204    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:09.780208    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:09.782479    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:09.782566    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:10.279033    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:10.279051    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:10.279074    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:10.279077    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:10.281035    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:42:10.778903    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:10.778916    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:10.778923    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:10.778926    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:10.781070    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:11.279519    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:11.279545    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:11.279587    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:11.279593    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:11.281932    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:11.780008    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:11.780026    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:11.780035    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:11.780041    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:11.782405    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:12.279415    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:12.279432    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:12.279438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:12.279441    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:12.283584    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:42:12.283720    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:12.779135    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:12.779163    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:12.779182    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:12.779192    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:12.782385    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:13.279985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:13.280010    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:13.280074    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:13.280083    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:13.287032    3744 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0831 15:42:13.778812    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:13.778824    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:13.778832    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:13.778837    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:13.780875    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:14.278421    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:14.278446    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:14.278459    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:14.278468    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:14.281130    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:14.778988    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:14.779006    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:14.779017    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:14.779025    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:14.782314    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:14.782385    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:15.279457    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:15.279477    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:15.279486    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:15.279492    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:15.281789    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:15.779420    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:15.779447    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:15.779459    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:15.779465    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:15.782822    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:16.278493    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:16.278512    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:16.278521    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:16.278526    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:16.280744    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:16.779399    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:16.779415    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:16.779421    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:16.779424    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:16.781497    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:17.279997    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:17.280026    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:17.280038    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:17.280046    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:17.283600    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:17.283684    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:17.778578    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:17.778593    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:17.778641    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:17.778645    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:17.780800    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:18.279627    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:18.279643    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:18.279650    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:18.279653    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:18.281669    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:18.778603    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:18.778615    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:18.778621    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:18.778625    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:18.781667    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:19.279738    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:19.279765    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:19.279777    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:19.279785    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:19.282926    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:19.779767    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:19.779781    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:19.779788    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:19.779791    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:19.781778    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:42:19.781867    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:19.781882    3744 node_ready.go:38] duration metric: took 4m0.003563812s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:42:19.812481    3744 out.go:201] 
	W0831 15:42:19.833493    3744 out.go:270] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0831 15:42:19.833512    3744 out.go:270] * 
	W0831 15:42:19.834711    3744 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 15:42:19.917735    3744 out.go:201] 
	
	
	==> Docker <==
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.219890563Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.220222326Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 cri-dockerd[1422]: time="2024-08-31T22:37:15Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/b2a8128cbfc292835f200d6551b039f9078ca4bc34012a439cb84e9977fa736b/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.321266510Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.321331709Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.321344565Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.321411223Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 cri-dockerd[1422]: time="2024-08-31T22:37:15Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/eb9132907eda4d53e71edd7c7c0cba6cb88a38299639a216ab3394c1ee636b08/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.533698709Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.533801876Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.533841143Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.535981528Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 cri-dockerd[1422]: time="2024-08-31T22:37:15Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/88b8aff8a006d67d53ddbefdb7171c2dba6d6b8082457d8b875b0980fe0a3f82/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.781886172Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.781967190Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.782044910Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.782180434Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:45 ha-949000 dockerd[1168]: time="2024-08-31T22:37:45.904555766Z" level=info msg="ignoring event" container=c7ade311e2b6bcc0e1f37e83b236eaec5caafb139b65d92f8114faaed4aacb77 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Aug 31 22:37:45 ha-949000 dockerd[1175]: time="2024-08-31T22:37:45.905026545Z" level=info msg="shim disconnected" id=c7ade311e2b6bcc0e1f37e83b236eaec5caafb139b65d92f8114faaed4aacb77 namespace=moby
	Aug 31 22:37:45 ha-949000 dockerd[1175]: time="2024-08-31T22:37:45.905076623Z" level=warning msg="cleaning up after shim disconnected" id=c7ade311e2b6bcc0e1f37e83b236eaec5caafb139b65d92f8114faaed4aacb77 namespace=moby
	Aug 31 22:37:45 ha-949000 dockerd[1175]: time="2024-08-31T22:37:45.905085418Z" level=info msg="cleaning up dead shim" namespace=moby
	Aug 31 22:37:58 ha-949000 dockerd[1175]: time="2024-08-31T22:37:58.377002915Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:37:58 ha-949000 dockerd[1175]: time="2024-08-31T22:37:58.377073590Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:37:58 ha-949000 dockerd[1175]: time="2024-08-31T22:37:58.377087074Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:58 ha-949000 dockerd[1175]: time="2024-08-31T22:37:58.377452368Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	9743646580e07       6e38f40d628db                                                                                         4 minutes ago       Running             storage-provisioner       2                   e485647500358       storage-provisioner
	f5deb862745e4       8c811b4aec35f                                                                                         5 minutes ago       Running             busybox                   1                   88b8aff8a006d       busybox-7dff88458-5kkbw
	f89b862064139       ad83b2ca7b09e                                                                                         5 minutes ago       Running             kube-proxy                1                   eb9132907eda4       kube-proxy-q7ndn
	ac487ac32c364       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   1                   b2a8128cbfc29       coredns-6f6b679f8f-snq8s
	ff98d7e38a1e6       12968670680f4                                                                                         5 minutes ago       Running             kindnet-cni               1                   fc1aa95e54f86       kindnet-jzj42
	c4dc6059b2150       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   1                   9b710526ef4f9       coredns-6f6b679f8f-kjszm
	c7ade311e2b6b       6e38f40d628db                                                                                         5 minutes ago       Exited              storage-provisioner       1                   e485647500358       storage-provisioner
	3dd9e3bd3e1f5       045733566833c                                                                                         5 minutes ago       Running             kube-controller-manager   2                   5f88515d4139e       kube-controller-manager-ha-949000
	5b0ac6b7faf7d       1766f54c897f0                                                                                         5 minutes ago       Running             kube-scheduler            1                   6e330e66cf27f       kube-scheduler-ha-949000
	fa476ce36b900       604f5db92eaa8                                                                                         5 minutes ago       Running             kube-apiserver            1                   05f6f2cfbf46d       kube-apiserver-ha-949000
	2255978551ea3       2e96e5913fc06                                                                                         5 minutes ago       Running             etcd                      1                   d62930734f2f9       etcd-ha-949000
	740de9cc660e2       045733566833c                                                                                         5 minutes ago       Exited              kube-controller-manager   1                   5f88515d4139e       kube-controller-manager-ha-949000
	0bb147eb5f408       38af8ddebf499                                                                                         5 minutes ago       Running             kube-vip                  0                   9ac139ab4844d       kube-vip-ha-949000
	2f925f16b74b0       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   9 minutes ago       Exited              busybox                   0                   f68483c946835       busybox-7dff88458-5kkbw
	b1db836cd7a3d       cbb01a7bd410d                                                                                         12 minutes ago      Exited              coredns                   0                   271da20951c9a       coredns-6f6b679f8f-kjszm
	def4d6bd20bc5       cbb01a7bd410d                                                                                         12 minutes ago      Exited              coredns                   0                   1017bd5eac1d2       coredns-6f6b679f8f-snq8s
	6d156ce626115       kindest/kindnetd@sha256:e59a687ca28ae274a2fc92f1e2f5f1c739f353178a43a23aafc71adb802ed166              12 minutes ago      Exited              kindnet-cni               0                   7d1851c17485c       kindnet-jzj42
	54d5f8041c89d       ad83b2ca7b09e                                                                                         12 minutes ago      Exited              kube-proxy                0                   4b0198ac7dc52       kube-proxy-q7ndn
	c734c23a53082       2e96e5913fc06                                                                                         12 minutes ago      Exited              etcd                      0                   7cfaf9f5d4dd4       etcd-ha-949000
	02c10e4f765d1       1766f54c897f0                                                                                         12 minutes ago      Exited              kube-scheduler            0                   c084f2a259f6c       kube-scheduler-ha-949000
	ffec6106be6c8       604f5db92eaa8                                                                                         12 minutes ago      Exited              kube-apiserver            0                   25c49852f78dc       kube-apiserver-ha-949000
	
	
	==> coredns [ac487ac32c36] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37668 - 17883 "HINFO IN 4931414995021238036.4254872758042696539. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.026863898s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1645472327]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30003ms):
	Trace[1645472327]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30002ms (22:37:45.839)
	Trace[1645472327]: [30.003429832s] [30.003429832s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[2054948566]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.838) (total time: 30003ms):
	Trace[2054948566]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30003ms (22:37:45.841)
	Trace[2054948566]: [30.003549662s] [30.003549662s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[850581595]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.840) (total time: 30001ms):
	Trace[850581595]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (22:37:45.841)
	Trace[850581595]: [30.001289039s] [30.001289039s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [b1db836cd7a3] <==
	[INFO] 10.244.1.2:58757 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000418868s
	[INFO] 10.244.1.2:39299 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000067106s
	[INFO] 10.244.2.2:56948 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000080585s
	[INFO] 10.244.2.2:56973 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000078985s
	[INFO] 10.244.2.2:43081 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000100123s
	[INFO] 10.244.2.2:56390 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000040214s
	[INFO] 10.244.2.2:52519 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000061255s
	[INFO] 10.244.0.4:36226 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000151133s
	[INFO] 10.244.1.2:44017 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089111s
	[INFO] 10.244.1.2:37224 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000069144s
	[INFO] 10.244.1.2:51282 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000118723s
	[INFO] 10.244.2.2:35009 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089507s
	[INFO] 10.244.2.2:60607 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000049176s
	[INFO] 10.244.2.2:36851 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000097758s
	[INFO] 10.244.0.4:59717 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000053986s
	[INFO] 10.244.0.4:58447 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000060419s
	[INFO] 10.244.1.2:60381 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000136898s
	[INFO] 10.244.1.2:32783 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.00010303s
	[INFO] 10.244.1.2:44904 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000042493s
	[INFO] 10.244.1.2:44085 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000132084s
	[INFO] 10.244.2.2:43635 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000080947s
	[INFO] 10.244.2.2:40020 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000081919s
	[INFO] 10.244.2.2:53730 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000058015s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [c4dc6059b215] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:55597 - 61955 "HINFO IN 5411809642052316829.545085282119266902. udp 56 false 512" NXDOMAIN qr,rd,ra 131 0.026601414s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1248174265]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30003ms):
	Trace[1248174265]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30002ms (22:37:45.839)
	Trace[1248174265]: [30.003765448s] [30.003765448s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[313955954]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.840) (total time: 30001ms):
	Trace[313955954]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (22:37:45.841)
	Trace[313955954]: [30.001623019s] [30.001623019s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1099528094]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30004ms):
	Trace[1099528094]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30004ms (22:37:45.842)
	Trace[1099528094]: [30.004679878s] [30.004679878s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [def4d6bd20bc] <==
	[INFO] 10.244.1.2:55576 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.000574417s
	[INFO] 10.244.1.2:36293 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000065455s
	[INFO] 10.244.2.2:41223 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000063892s
	[INFO] 10.244.0.4:54135 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000096141s
	[INFO] 10.244.0.4:39176 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000742646s
	[INFO] 10.244.0.4:58445 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000080113s
	[INFO] 10.244.0.4:56242 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000066269s
	[INFO] 10.244.0.4:60657 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049645s
	[INFO] 10.244.1.2:48306 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000561931s
	[INFO] 10.244.1.2:40767 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000077826s
	[INFO] 10.244.1.2:35669 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000056994s
	[INFO] 10.244.1.2:57720 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000040565s
	[INFO] 10.244.2.2:38794 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000136901s
	[INFO] 10.244.2.2:33576 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000052374s
	[INFO] 10.244.2.2:57053 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000051289s
	[INFO] 10.244.0.4:47623 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000056903s
	[INFO] 10.244.0.4:59818 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00003011s
	[INFO] 10.244.0.4:53586 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000029565s
	[INFO] 10.244.1.2:60045 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000060878s
	[INFO] 10.244.2.2:38400 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000078624s
	[INFO] 10.244.0.4:58765 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000075707s
	[INFO] 10.244.0.4:32804 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000050785s
	[INFO] 10.244.2.2:48459 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.00007773s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	Name:               ha-949000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:29:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:42:12 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:42:12 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:42:12 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:42:12 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:42:12 +0000   Sat, 31 Aug 2024 22:37:06 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-949000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 199c42a1ef3943388f047673dca52741
	  System UUID:                98ca49d1-0000-0000-9e6c-321a4533d56e
	  Boot ID:                    ede31f27-0dff-4107-9a48-7cb2c0328412
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-5kkbw              0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 coredns-6f6b679f8f-kjszm             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     12m
	  kube-system                 coredns-6f6b679f8f-snq8s             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     12m
	  kube-system                 etcd-ha-949000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         12m
	  kube-system                 kindnet-jzj42                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      12m
	  kube-system                 kube-apiserver-ha-949000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-controller-manager-ha-949000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-proxy-q7ndn                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-scheduler-ha-949000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-vip-ha-949000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m8s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 12m                    kube-proxy       
	  Normal  Starting                 5m6s                   kube-proxy       
	  Normal  Starting                 12m                    kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  12m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  12m                    kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    12m                    kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     12m                    kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           12m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeReady                12m                    kubelet          Node ha-949000 status is now: NodeReady
	  Normal  RegisteredNode           11m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           10m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           7m53s                  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeHasSufficientMemory  5m54s (x8 over 5m54s)  kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  Starting                 5m54s                  kubelet          Starting kubelet.
	  Normal  NodeHasNoDiskPressure    5m54s (x8 over 5m54s)  kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     5m54s (x7 over 5m54s)  kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  5m54s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           5m23s                  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           5m5s                   node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           4m39s                  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	
	
	Name:               ha-949000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:30:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:42:21 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:42:01 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:42:01 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:42:01 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:42:01 +0000   Sat, 31 Aug 2024 22:31:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-949000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 86a1a86d2cdf4cba8c80d25d466d7a14
	  System UUID:                23e54f3d-0000-0000-86b7-b25c818528d1
	  Boot ID:                    eb3152fc-98b8-4334-9705-7b182a7d2f78
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-6r9s5                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 etcd-ha-949000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         11m
	  kube-system                 kindnet-brtj6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      11m
	  kube-system                 kube-apiserver-ha-949000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 kube-controller-manager-ha-949000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 kube-proxy-4r2bt                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 kube-scheduler-ha-949000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 kube-vip-ha-949000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         11m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 5m24s                  kube-proxy       
	  Normal   Starting                 7m56s                  kube-proxy       
	  Normal   Starting                 11m                    kube-proxy       
	  Normal   NodeAllocatableEnforced  11m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  11m (x8 over 11m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    11m (x8 over 11m)      kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     11m (x7 over 11m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           11m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           11m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           10m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   Starting                 8m                     kubelet          Starting kubelet.
	  Warning  Rebooted                 8m                     kubelet          Node ha-949000-m02 has been rebooted, boot id: 4ddbe4b0-7ef0-4715-a631-f977c123c463
	  Normal   NodeHasSufficientPID     8m                     kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  8m                     kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  8m                     kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    8m                     kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   RegisteredNode           7m53s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   Starting                 5m36s                  kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  5m36s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  5m35s (x8 over 5m36s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    5m35s (x8 over 5m36s)  kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     5m35s (x7 over 5m36s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           5m23s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           5m5s                   node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           4m39s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	
	
	Name:               ha-949000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_31_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:31:50 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:42:12 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:37:36 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:37:36 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:37:36 +0000   Sat, 31 Aug 2024 22:31:50 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:37:36 +0000   Sat, 31 Aug 2024 22:32:13 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-949000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 083559d81ea744cfa06123a4403e948b
	  System UUID:                3fde4d5b-0000-0000-8412-6ae6e5c787bb
	  Boot ID:                    05b327b3-b25f-4168-b6df-6e3d5b4df067
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-vjf9x                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 etcd-ha-949000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         10m
	  kube-system                 kindnet-9j85v                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      10m
	  kube-system                 kube-apiserver-ha-949000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 kube-controller-manager-ha-949000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 kube-proxy-d45q5                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 kube-scheduler-ha-949000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 kube-vip-ha-949000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 4m42s              kube-proxy       
	  Normal   Starting                 10m                kube-proxy       
	  Normal   NodeAllocatableEnforced  10m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  10m (x8 over 10m)  kubelet          Node ha-949000-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    10m (x8 over 10m)  kubelet          Node ha-949000-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     10m (x7 over 10m)  kubelet          Node ha-949000-m03 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           10m                node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal   RegisteredNode           10m                node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal   RegisteredNode           10m                node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal   RegisteredNode           7m53s              node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal   RegisteredNode           5m23s              node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal   RegisteredNode           5m5s               node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	  Normal   Starting                 4m46s              kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  4m46s              kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  4m46s              kubelet          Node ha-949000-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    4m46s              kubelet          Node ha-949000-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     4m46s              kubelet          Node ha-949000-m03 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 4m46s              kubelet          Node ha-949000-m03 has been rebooted, boot id: 05b327b3-b25f-4168-b6df-6e3d5b4df067
	  Normal   RegisteredNode           4m39s              node-controller  Node ha-949000-m03 event: Registered Node ha-949000-m03 in Controller
	
	
	==> dmesg <==
	[  +0.000001] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.036538] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.008025] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[  +5.657655] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.007505] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.775908] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.226303] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000003] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.522399] systemd-fstab-generator[463]: Ignoring "noauto" option for root device
	[  +0.101678] systemd-fstab-generator[475]: Ignoring "noauto" option for root device
	[  +1.969329] systemd-fstab-generator[1097]: Ignoring "noauto" option for root device
	[  +0.262499] systemd-fstab-generator[1134]: Ignoring "noauto" option for root device
	[  +0.055714] kauditd_printk_skb: 101 callbacks suppressed
	[  +0.044427] systemd-fstab-generator[1146]: Ignoring "noauto" option for root device
	[  +0.122906] systemd-fstab-generator[1160]: Ignoring "noauto" option for root device
	[  +2.475814] systemd-fstab-generator[1375]: Ignoring "noauto" option for root device
	[  +0.112565] systemd-fstab-generator[1387]: Ignoring "noauto" option for root device
	[  +0.102686] systemd-fstab-generator[1399]: Ignoring "noauto" option for root device
	[  +0.126445] systemd-fstab-generator[1414]: Ignoring "noauto" option for root device
	[  +0.454968] systemd-fstab-generator[1576]: Ignoring "noauto" option for root device
	[  +6.916629] kauditd_printk_skb: 212 callbacks suppressed
	[ +21.586391] kauditd_printk_skb: 40 callbacks suppressed
	[Aug31 22:37] kauditd_printk_skb: 83 callbacks suppressed
	
	
	==> etcd [2255978551ea] <==
	{"level":"warn","ts":"2024-08-31T22:37:27.532567Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:37:27.632410Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:37:27.682960Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:37:27.688618Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:37:27.690454Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:37:27.692109Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:37:27.714437Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:37:27.717160Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:37:27.718234Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:37:27.732376Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-08-31T22:37:30.499052Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"6bcd180d94f2f42","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"warn","ts":"2024-08-31T22:37:30.499644Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"6bcd180d94f2f42","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"warn","ts":"2024-08-31T22:37:30.988815Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.7:2380/version","remote-member-id":"6bcd180d94f2f42","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:37:30.988942Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"6bcd180d94f2f42","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:37:34.990992Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.7:2380/version","remote-member-id":"6bcd180d94f2f42","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:37:34.991044Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"6bcd180d94f2f42","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:37:35.499498Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"6bcd180d94f2f42","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-08-31T22:37:35.500810Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"6bcd180d94f2f42","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"info","ts":"2024-08-31T22:37:38.086004Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:37:38.086155Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:37:38.088468Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:37:38.103271Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"6bcd180d94f2f42","stream-type":"stream Message"}
	{"level":"info","ts":"2024-08-31T22:37:38.103349Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:37:38.121926Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"6bcd180d94f2f42","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-08-31T22:37:38.122172Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	
	
	==> etcd [c734c23a5308] <==
	{"level":"info","ts":"2024-08-31T22:36:02.089341Z","caller":"traceutil/trace.go:171","msg":"trace[1950473945] range","detail":"{range_begin:/registry/secrets/; range_end:/registry/secrets0; }","duration":"5.07880235s","start":"2024-08-31T22:35:57.010534Z","end":"2024-08-31T22:36:02.089336Z","steps":["trace[1950473945] 'agreement among raft nodes before linearized reading'  (duration: 5.078744702s)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:36:02.089376Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-08-31T22:35:57.010497Z","time spent":"5.078873485s","remote":"127.0.0.1:50354","response type":"/etcdserverpb.KV/Range","request count":0,"request size":42,"response count":0,"response size":0,"request content":"key:\"/registry/secrets/\" range_end:\"/registry/secrets0\" count_only:true "}
	2024/08/31 22:36:02 WARNING: [core] [Server #8] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-08-31T22:36:02.089450Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"3.731895172s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/statefulsets/\" range_end:\"/registry/statefulsets0\" count_only:true ","response":"","error":"context canceled"}
	{"level":"info","ts":"2024-08-31T22:36:02.089464Z","caller":"traceutil/trace.go:171","msg":"trace[1668294552] range","detail":"{range_begin:/registry/statefulsets/; range_end:/registry/statefulsets0; }","duration":"3.731928485s","start":"2024-08-31T22:35:58.357532Z","end":"2024-08-31T22:36:02.089460Z","steps":["trace[1668294552] 'agreement among raft nodes before linearized reading'  (duration: 3.731895116s)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:36:02.089476Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-08-31T22:35:58.357516Z","time spent":"3.731956501s","remote":"127.0.0.1:50712","response type":"/etcdserverpb.KV/Range","request count":0,"request size":52,"response count":0,"response size":0,"request content":"key:\"/registry/statefulsets/\" range_end:\"/registry/statefulsets0\" count_only:true "}
	2024/08/31 22:36:02 WARNING: [core] [Server #8] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"info","ts":"2024-08-31T22:36:02.126515Z","caller":"etcdserver/server.go:1512","msg":"skipped leadership transfer; local server is not leader","local-member-id":"b8c6c7563d17d844","current-leader-member-id":"0"}
	{"level":"info","ts":"2024-08-31T22:36:02.127073Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127125Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127142Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127279Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127328Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127353Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127363Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127367Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.127373Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.127406Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.127962Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.128009Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.128078Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.128107Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.129535Z","caller":"embed/etcd.go:581","msg":"stopping serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-08-31T22:36:02.129687Z","caller":"embed/etcd.go:586","msg":"stopped serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-08-31T22:36:02.129696Z","caller":"embed/etcd.go:379","msg":"closed etcd server","name":"ha-949000","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.169.0.5:2380"],"advertise-client-urls":["https://192.169.0.5:2379"]}
	
	
	==> kernel <==
	 22:42:23 up 6 min,  0 users,  load average: 0.20, 0.33, 0.18
	Linux ha-949000 5.10.207 #1 SMP Wed Aug 28 20:54:17 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [6d156ce62611] <==
	I0831 22:35:15.620720       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:25.613908       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:25.614028       1 main.go:299] handling current node
	I0831 22:35:25.614079       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:25.614094       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:25.614736       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:25.614790       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:35.621230       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:35.621411       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:35.621574       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:35.621705       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:35.621830       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:35.621998       1 main.go:299] handling current node
	I0831 22:35:45.622596       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:45.622733       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:45.623036       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:45.623089       1 main.go:299] handling current node
	I0831 22:35:45.623265       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:45.623338       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:55.614888       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:55.614962       1 main.go:299] handling current node
	I0831 22:35:55.614980       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:55.614989       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:55.615216       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:55.615320       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kindnet [ff98d7e38a1e] <==
	I0831 22:41:36.423611       1 main.go:299] handling current node
	I0831 22:41:46.423117       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:41:46.423139       1 main.go:299] handling current node
	I0831 22:41:46.423150       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:41:46.423154       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:41:46.423374       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:41:46.423426       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:41:56.421263       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:41:56.421342       1 main.go:299] handling current node
	I0831 22:41:56.421361       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:41:56.421371       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:41:56.421483       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:41:56.421556       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:06.419300       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:42:06.419355       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:06.419448       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:06.419540       1 main.go:299] handling current node
	I0831 22:42:06.419587       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:06.419596       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:16.418758       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:42:16.418878       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:16.419144       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:16.419199       1 main.go:299] handling current node
	I0831 22:42:16.419230       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:16.419256       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [fa476ce36b90] <==
	I0831 22:36:55.851684       1 controller.go:119] Starting legacy_token_tracking_controller
	I0831 22:36:55.873485       1 shared_informer.go:313] Waiting for caches to sync for configmaps
	I0831 22:36:55.948972       1 apf_controller.go:382] Running API Priority and Fairness config worker
	I0831 22:36:55.949005       1 apf_controller.go:385] Running API Priority and Fairness periodic rebalancing process
	I0831 22:36:55.949434       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0831 22:36:55.949812       1 cache.go:39] Caches are synced for RemoteAvailability controller
	I0831 22:36:55.953147       1 cache.go:39] Caches are synced for LocalAvailability controller
	I0831 22:36:55.953575       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0831 22:36:55.954480       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0831 22:36:55.954969       1 aggregator.go:171] initial CRD sync complete...
	I0831 22:36:55.955092       1 autoregister_controller.go:144] Starting autoregister controller
	I0831 22:36:55.955194       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0831 22:36:55.955309       1 cache.go:39] Caches are synced for autoregister controller
	I0831 22:36:55.957677       1 handler_discovery.go:450] Starting ResourceDiscoveryManager
	W0831 22:36:55.960494       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.6]
	I0831 22:36:55.974621       1 shared_informer.go:320] Caches are synced for configmaps
	I0831 22:36:55.982646       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0831 22:36:55.982788       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0831 22:36:55.982866       1 policy_source.go:224] refreshing policies
	I0831 22:36:55.990600       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0831 22:36:56.065496       1 controller.go:615] quota admission added evaluator for: endpoints
	I0831 22:36:56.078415       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	E0831 22:36:56.080666       1 controller.go:95] Found stale data, removed previous endpoints on kubernetes service, apiserver didn't exit successfully previously
	I0831 22:36:56.858259       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	W0831 22:36:57.190605       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	
	
	==> kube-apiserver [ffec6106be6c] <==
	W0831 22:36:02.115125       1 logging.go:55] [core] [Channel #73 SubChannel #74]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.115222       1 logging.go:55] [core] [Channel #127 SubChannel #128]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.115245       1 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.115261       1 logging.go:55] [core] [Channel #1 SubChannel #3]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.115276       1 logging.go:55] [core] [Channel #133 SubChannel #134]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119407       1 logging.go:55] [core] [Channel #157 SubChannel #158]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119539       1 logging.go:55] [core] [Channel #115 SubChannel #116]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119557       1 logging.go:55] [core] [Channel #46 SubChannel #47]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119573       1 logging.go:55] [core] [Channel #55 SubChannel #56]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119587       1 logging.go:55] [core] [Channel #25 SubChannel #26]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119602       1 logging.go:55] [core] [Channel #82 SubChannel #83]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119655       1 logging.go:55] [core] [Channel #79 SubChannel #80]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119675       1 logging.go:55] [core] [Channel #88 SubChannel #89]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119696       1 logging.go:55] [core] [Channel #94 SubChannel #95]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119711       1 logging.go:55] [core] [Channel #100 SubChannel #101]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119786       1 logging.go:55] [core] [Channel #130 SubChannel #131]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119813       1 logging.go:55] [core] [Channel #124 SubChannel #125]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119870       1 logging.go:55] [core] [Channel #76 SubChannel #77]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119955       1 logging.go:55] [core] [Channel #172 SubChannel #173]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119994       1 logging.go:55] [core] [Channel #121 SubChannel #122]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.120283       1 logging.go:55] [core] [Channel #154 SubChannel #155]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.120304       1 logging.go:55] [core] [Channel #151 SubChannel #152]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.120414       1 logging.go:55] [core] [Channel #142 SubChannel #143]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.120438       1 logging.go:55] [core] [Channel #166 SubChannel #167]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.114925       1 logging.go:55] [core] [Channel #145 SubChannel #146]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	
	
	==> kube-controller-manager [3dd9e3bd3e1f] <==
	I0831 22:37:17.099961       1 shared_informer.go:320] Caches are synced for certificate-csrsigning-kube-apiserver-client
	I0831 22:37:17.101681       1 shared_informer.go:320] Caches are synced for stateful set
	I0831 22:37:17.165112       1 shared_informer.go:320] Caches are synced for cronjob
	I0831 22:37:17.189821       1 shared_informer.go:320] Caches are synced for job
	I0831 22:37:17.193803       1 shared_informer.go:320] Caches are synced for TTL after finished
	I0831 22:37:17.211620       1 shared_informer.go:320] Caches are synced for resource quota
	I0831 22:37:17.228523       1 shared_informer.go:320] Caches are synced for resource quota
	I0831 22:37:17.620032       1 shared_informer.go:320] Caches are synced for garbage collector
	I0831 22:37:17.620047       1 shared_informer.go:320] Caches are synced for garbage collector
	I0831 22:37:17.620175       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0831 22:37:36.866111       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:37:37.804268       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="44.68915ms"
	I0831 22:37:37.804420       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="106.831µs"
	I0831 22:37:40.196435       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="20.594737ms"
	I0831 22:37:40.196800       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="175.922µs"
	I0831 22:37:54.687068       1 endpointslice_controller.go:344] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mxss9 EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mxss9\": the object has been modified; please apply your changes to the latest version and try again"
	I0831 22:37:54.687554       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"c225b6ce-9d24-451b-aa4c-2f6d57886b05", APIVersion:"v1", ResourceVersion:"257", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mxss9 EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mxss9": the object has been modified; please apply your changes to the latest version and try again
	I0831 22:37:54.697104       1 endpointslice_controller.go:344] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mxss9 EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mxss9\": the object has been modified; please apply your changes to the latest version and try again"
	I0831 22:37:54.697155       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"c225b6ce-9d24-451b-aa4c-2f6d57886b05", APIVersion:"v1", ResourceVersion:"257", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mxss9 EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mxss9": the object has been modified; please apply your changes to the latest version and try again
	I0831 22:37:54.698321       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-6f6b679f8f" duration="73.860325ms"
	E0831 22:37:54.698593       1 replica_set.go:560] "Unhandled Error" err="sync \"kube-system/coredns-6f6b679f8f\" failed with Operation cannot be fulfilled on replicasets.apps \"coredns-6f6b679f8f\": the object has been modified; please apply your changes to the latest version and try again" logger="UnhandledError"
	I0831 22:37:54.701342       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-6f6b679f8f" duration="61.894µs"
	I0831 22:37:54.706798       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-6f6b679f8f" duration="103.162µs"
	I0831 22:42:02.055055       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m02"
	I0831 22:42:12.841493       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000"
	
	
	==> kube-controller-manager [740de9cc660e] <==
	I0831 22:36:36.160199       1 serving.go:386] Generated self-signed cert in-memory
	I0831 22:36:36.406066       1 controllermanager.go:197] "Starting" version="v1.31.0"
	I0831 22:36:36.406213       1 controllermanager.go:199] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:36:36.407965       1 dynamic_cafile_content.go:160] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0831 22:36:36.408151       1 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0831 22:36:36.408699       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0831 22:36:36.408792       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	E0831 22:36:56.415496       1 controllermanager.go:242] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: an error on the server (\"[+]ping ok\\n[+]log ok\\n[+]etcd ok\\n[+]poststarthook/start-apiserver-admission-initializer ok\\n[+]poststarthook/generic-apiserver-start-informers ok\\n[+]poststarthook/priority-and-fairness-config-consumer ok\\n[+]poststarthook/priority-and-fairness-filter ok\\n[+]poststarthook/storage-object-count-tracker-hook ok\\n[+]poststarthook/start-apiextensions-informers ok\\n[+]poststarthook/start-apiextensions-controllers ok\\n[+]poststarthook/crd-informer-synced ok\\n[+]poststarthook/start-system-namespaces-controller ok\\n[+]poststarthook/start-cluster-authentication-info-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok\\n[+]poststarthook/start-legacy-token-tracking-controller ok\\n[+]poststarthook/start-service-ip-repair-controllers ok\\n[-]poststarthook/rbac/bootstrap-roles failed: reason withheld\\n[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld\\n[+]poststarthook/priority-and-fairness-config-producer ok\\n[+]poststarthook/bootstrap-controller ok\\n[+]poststarthook/aggregator-reload-proxy-client-cert ok\\n[+]poststarthook/start-kube-aggregator-informers ok\\n[+]poststarthook/apiservice-status-local-available-controller ok\\n[+]poststarthook/apiservice-status-remote-available-controller ok\\n[+]poststarthook/apiservice-registration-controller ok\\n[+]poststarthook/apiservice-discovery-controller ok\\n[+]poststarthook/kube-apiserver-autoregistration ok\\n[+]autoregister-completion ok\\n[+]poststarthook/apiservice-openapi-controller ok\\n[+]poststarthook/apiservice-openapiv3-controller ok\\nhealthz check failed\") has prevented the request from succeeding"
	
	
	==> kube-proxy [54d5f8041c89] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:29:49.977338       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:29:49.983071       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:29:49.983430       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:29:50.023032       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:29:50.023054       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:29:50.023070       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:29:50.025790       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:29:50.026014       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:29:50.026061       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:29:50.026844       1 config.go:197] "Starting service config controller"
	I0831 22:29:50.027602       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:29:50.027141       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:29:50.027698       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:29:50.027260       1 config.go:326] "Starting node config controller"
	I0831 22:29:50.027720       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:29:50.128122       1 shared_informer.go:320] Caches are synced for node config
	I0831 22:29:50.128144       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:29:50.128162       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-proxy [f89b86206413] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:37:16.195275       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:37:16.220357       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:37:16.220590       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:37:16.265026       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:37:16.265177       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:37:16.265305       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:37:16.268348       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:37:16.268734       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:37:16.269061       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:37:16.272514       1 config.go:197] "Starting service config controller"
	I0831 22:37:16.273450       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:37:16.273658       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:37:16.273777       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:37:16.275413       1 config.go:326] "Starting node config controller"
	I0831 22:37:16.277042       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:37:16.374257       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:37:16.375624       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0831 22:37:16.377606       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [02c10e4f765d] <==
	E0831 22:29:42.107231       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.111966       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0831 22:29:42.112045       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.116498       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0831 22:29:42.116539       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.129701       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0831 22:29:42.129741       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError"
	I0831 22:29:45.342252       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0831 22:31:50.464567       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.464652       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod 9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2(kube-system/kube-proxy-d45q5) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-d45q5"
	E0831 22:31:50.464667       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" pod="kube-system/kube-proxy-d45q5"
	I0831 22:31:50.464683       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.476710       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:31:50.476756       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod c551bb18-9a7d-4fca-9724-be7900980a40(kube-system/kindnet-l4zbh) wasn't assumed so cannot be forgotten" pod="kube-system/kindnet-l4zbh"
	E0831 22:31:50.476767       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" pod="kube-system/kindnet-l4zbh"
	I0831 22:31:50.476781       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:32:20.049491       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-6r9s5" node="ha-949000-m02"
	E0831 22:32:20.049618       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" pod="default/busybox-7dff88458-6r9s5"
	E0831 22:32:20.071235       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-vjf9x" node="ha-949000-m03"
	E0831 22:32:20.071466       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" pod="default/busybox-7dff88458-vjf9x"
	E0831 22:32:20.073498       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	E0831 22:32:20.073571       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod e97e21d8-a69e-451c-babd-6232e12aafe0(default/busybox-7dff88458-5kkbw) wasn't assumed so cannot be forgotten" pod="default/busybox-7dff88458-5kkbw"
	E0831 22:32:20.077323       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" pod="default/busybox-7dff88458-5kkbw"
	I0831 22:32:20.077394       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	E0831 22:36:01.972805       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [5b0ac6b7faf7] <==
	I0831 22:36:35.937574       1 serving.go:386] Generated self-signed cert in-memory
	W0831 22:36:46.491998       1 authentication.go:370] Error looking up in-cluster authentication configuration: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication": net/http: TLS handshake timeout
	W0831 22:36:46.492020       1 authentication.go:371] Continuing without authentication configuration. This may treat all requests as anonymous.
	W0831 22:36:46.492025       1 authentication.go:372] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I0831 22:36:55.901677       1 server.go:167] "Starting Kubernetes Scheduler" version="v1.31.0"
	I0831 22:36:55.901714       1 server.go:169] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:36:55.904943       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0831 22:36:55.905195       1 secure_serving.go:213] Serving securely on 127.0.0.1:10259
	I0831 22:36:55.905729       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I0831 22:36:55.906036       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0831 22:36:56.006746       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Aug 31 22:37:28 ha-949000 kubelet[1583]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:37:46 ha-949000 kubelet[1583]: I0831 22:37:46.280595    1583 scope.go:117] "RemoveContainer" containerID="22fbb8a8e01ad38fed3f9768b042e769c8d9657caf4e54a98b959944ddbe952f"
	Aug 31 22:37:46 ha-949000 kubelet[1583]: I0831 22:37:46.280791    1583 scope.go:117] "RemoveContainer" containerID="c7ade311e2b6bcc0e1f37e83b236eaec5caafb139b65d92f8114faaed4aacb77"
	Aug 31 22:37:46 ha-949000 kubelet[1583]: E0831 22:37:46.280873    1583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(03bcdd23-f7f2-45a9-ab95-91918e094226)\"" pod="kube-system/storage-provisioner" podUID="03bcdd23-f7f2-45a9-ab95-91918e094226"
	Aug 31 22:37:58 ha-949000 kubelet[1583]: I0831 22:37:58.311210    1583 scope.go:117] "RemoveContainer" containerID="c7ade311e2b6bcc0e1f37e83b236eaec5caafb139b65d92f8114faaed4aacb77"
	Aug 31 22:38:28 ha-949000 kubelet[1583]: E0831 22:38:28.334569    1583 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:38:28 ha-949000 kubelet[1583]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:38:28 ha-949000 kubelet[1583]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:38:28 ha-949000 kubelet[1583]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:38:28 ha-949000 kubelet[1583]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:39:28 ha-949000 kubelet[1583]: E0831 22:39:28.333827    1583 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:39:28 ha-949000 kubelet[1583]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:39:28 ha-949000 kubelet[1583]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:39:28 ha-949000 kubelet[1583]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:39:28 ha-949000 kubelet[1583]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:40:28 ha-949000 kubelet[1583]: E0831 22:40:28.335276    1583 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:40:28 ha-949000 kubelet[1583]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:40:28 ha-949000 kubelet[1583]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:40:28 ha-949000 kubelet[1583]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:40:28 ha-949000 kubelet[1583]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:41:28 ha-949000 kubelet[1583]: E0831 22:41:28.333999    1583 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:41:28 ha-949000 kubelet[1583]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:41:28 ha-949000 kubelet[1583]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:41:28 ha-949000 kubelet[1583]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:41:28 ha-949000 kubelet[1583]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

-- /stdout --
helpers_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-949000 -n ha-949000
helpers_test.go:262: (dbg) Run:  kubectl --context ha-949000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:286: <<< TestMultiControlPlane/serial/RestartClusterKeepsNodes FAILED: end of post-mortem logs <<<
helpers_test.go:287: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/RestartClusterKeepsNodes (408.58s)

TestMultiControlPlane/serial/DeleteSecondaryNode (11.5s)

=== RUN   TestMultiControlPlane/serial/DeleteSecondaryNode
ha_test.go:487: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 node delete m03 -v=7 --alsologtostderr
ha_test.go:487: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 node delete m03 -v=7 --alsologtostderr: (6.83019402s)
ha_test.go:493: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:493: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (357.658346ms)

-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0831 15:42:31.977617    3948 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:42:31.977940    3948 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:42:31.977946    3948 out.go:358] Setting ErrFile to fd 2...
	I0831 15:42:31.977949    3948 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:42:31.978121    3948 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:42:31.978309    3948 out.go:352] Setting JSON to false
	I0831 15:42:31.978331    3948 mustload.go:65] Loading cluster: ha-949000
	I0831 15:42:31.978374    3948 notify.go:220] Checking for updates...
	I0831 15:42:31.978648    3948 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:42:31.978663    3948 status.go:255] checking status of ha-949000 ...
	I0831 15:42:31.979026    3948 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:31.979080    3948 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:31.988158    3948 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51969
	I0831 15:42:31.988565    3948 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:31.989014    3948 main.go:141] libmachine: Using API Version  1
	I0831 15:42:31.989024    3948 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:31.989239    3948 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:31.989362    3948 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:42:31.989443    3948 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:31.989522    3948 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:42:31.990489    3948 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:42:31.990513    3948 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:42:31.990761    3948 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:31.990790    3948 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:31.999643    3948 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51971
	I0831 15:42:31.999997    3948 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:32.000312    3948 main.go:141] libmachine: Using API Version  1
	I0831 15:42:32.000322    3948 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:32.000542    3948 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:32.000646    3948 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:42:32.000732    3948 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:42:32.000997    3948 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:32.001020    3948 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:32.009561    3948 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51973
	I0831 15:42:32.009881    3948 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:32.010178    3948 main.go:141] libmachine: Using API Version  1
	I0831 15:42:32.010189    3948 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:32.010422    3948 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:32.010553    3948 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:32.010697    3948 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:42:32.010718    3948 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:42:32.010803    3948 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:42:32.010882    3948 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:42:32.010961    3948 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:42:32.011036    3948 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:42:32.043684    3948 ssh_runner.go:195] Run: systemctl --version
	I0831 15:42:32.048050    3948 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:42:32.058721    3948 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:42:32.058743    3948 api_server.go:166] Checking apiserver status ...
	I0831 15:42:32.058791    3948 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:42:32.075653    3948 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2076/cgroup
	W0831 15:42:32.083580    3948 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2076/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:42:32.083627    3948 ssh_runner.go:195] Run: ls
	I0831 15:42:32.086855    3948 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:42:32.091067    3948 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:42:32.091086    3948 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:42:32.091095    3948 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:42:32.091109    3948 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:42:32.091384    3948 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:32.091406    3948 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:32.100136    3948 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51977
	I0831 15:42:32.100472    3948 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:32.100828    3948 main.go:141] libmachine: Using API Version  1
	I0831 15:42:32.100845    3948 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:32.101043    3948 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:32.101161    3948 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:42:32.101256    3948 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:32.101333    3948 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3763
	I0831 15:42:32.102307    3948 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:42:32.102315    3948 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:42:32.102560    3948 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:32.102580    3948 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:32.111006    3948 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51979
	I0831 15:42:32.111323    3948 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:32.111669    3948 main.go:141] libmachine: Using API Version  1
	I0831 15:42:32.111685    3948 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:32.111899    3948 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:32.111995    3948 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:42:32.112076    3948 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:42:32.112335    3948 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:32.112357    3948 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:32.121526    3948 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51981
	I0831 15:42:32.122031    3948 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:32.122381    3948 main.go:141] libmachine: Using API Version  1
	I0831 15:42:32.122392    3948 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:32.122620    3948 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:32.122732    3948 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:42:32.122849    3948 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:42:32.122859    3948 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:42:32.122952    3948 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:42:32.123077    3948 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:42:32.123159    3948 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:42:32.123254    3948 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:42:32.161858    3948 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:42:32.174687    3948 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:42:32.174702    3948 api_server.go:166] Checking apiserver status ...
	I0831 15:42:32.174737    3948 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:42:32.186803    3948 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2120/cgroup
	W0831 15:42:32.195836    3948 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2120/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:42:32.195885    3948 ssh_runner.go:195] Run: ls
	I0831 15:42:32.199140    3948 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:42:32.202245    3948 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:42:32.202258    3948 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:42:32.202269    3948 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:42:32.202280    3948 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:42:32.202545    3948 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:32.202566    3948 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:32.211083    3948 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51985
	I0831 15:42:32.211426    3948 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:32.211770    3948 main.go:141] libmachine: Using API Version  1
	I0831 15:42:32.211788    3948 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:32.212011    3948 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:32.212112    3948 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:42:32.212196    3948 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:32.212290    3948 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3806
	I0831 15:42:32.213280    3948 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:42:32.213290    3948 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:42:32.213556    3948 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:32.213579    3948 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:32.222155    3948 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51987
	I0831 15:42:32.222491    3948 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:32.222821    3948 main.go:141] libmachine: Using API Version  1
	I0831 15:42:32.222843    3948 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:32.223041    3948 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:32.223146    3948 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:42:32.223233    3948 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:42:32.223499    3948 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:32.223521    3948 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:32.232055    3948 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51989
	I0831 15:42:32.232409    3948 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:32.232724    3948 main.go:141] libmachine: Using API Version  1
	I0831 15:42:32.232740    3948 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:32.232964    3948 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:32.233070    3948 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:42:32.233183    3948 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:42:32.233193    3948 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:42:32.233278    3948 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:42:32.233359    3948 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:42:32.233435    3948 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:42:32.233518    3948 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:42:32.268702    3948 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:42:32.279205    3948 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:495: failed to run minikube status. args "out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr" : exit status 2
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-949000 -n ha-949000
helpers_test.go:245: <<< TestMultiControlPlane/serial/DeleteSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:246: ======>  post-mortem[TestMultiControlPlane/serial/DeleteSecondaryNode]: minikube logs <======
helpers_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 logs -n 25
helpers_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 logs -n 25: (3.319803909s)
helpers_test.go:253: TestMultiControlPlane/serial/DeleteSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-949000 -- get pods -o          | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- get pods -o          | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-949000 -v=7                | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-949000 node stop m02 -v=7         | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:33 PDT | 31 Aug 24 15:33 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-949000 node start m02 -v=7        | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:34 PDT | 31 Aug 24 15:34 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-949000 -v=7               | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:35 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-949000 -v=7                    | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:35 PDT | 31 Aug 24 15:36 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-949000 --wait=true -v=7        | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:36 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-949000                    | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT |                     |
	| node    | ha-949000 node delete m03 -v=7       | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT | 31 Aug 24 15:42 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:36:09
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:36:09.764310    3744 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:36:09.764592    3744 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:36:09.764597    3744 out.go:358] Setting ErrFile to fd 2...
	I0831 15:36:09.764601    3744 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:36:09.764770    3744 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:36:09.766289    3744 out.go:352] Setting JSON to false
	I0831 15:36:09.790255    3744 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2140,"bootTime":1725141629,"procs":434,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:36:09.790362    3744 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:36:09.812967    3744 out.go:177] * [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:36:09.857017    3744 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:36:09.857063    3744 notify.go:220] Checking for updates...
	I0831 15:36:09.900714    3744 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:09.921979    3744 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:36:09.948841    3744 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:36:09.970509    3744 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:36:09.991512    3744 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:36:10.013794    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:10.013954    3744 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:36:10.014628    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:10.014709    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:10.024181    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51800
	I0831 15:36:10.024557    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:10.024973    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:10.024981    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:10.025208    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:10.025338    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:10.053425    3744 out.go:177] * Using the hyperkit driver based on existing profile
	I0831 15:36:10.095518    3744 start.go:297] selected driver: hyperkit
	I0831 15:36:10.095547    3744 start.go:901] validating driver "hyperkit" against &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:36:10.095803    3744 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:36:10.095991    3744 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:36:10.096192    3744 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:36:10.105897    3744 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:36:10.111634    3744 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:10.111657    3744 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:36:10.114891    3744 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:36:10.114962    3744 cni.go:84] Creating CNI manager for ""
	I0831 15:36:10.114970    3744 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0831 15:36:10.115051    3744 start.go:340] cluster config:
	{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:36:10.115155    3744 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:36:10.157575    3744 out.go:177] * Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	I0831 15:36:10.178565    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:36:10.178634    3744 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:36:10.178661    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:36:10.178859    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:36:10.178882    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:36:10.179080    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:10.179968    3744 start.go:360] acquireMachinesLock for ha-949000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:36:10.180093    3744 start.go:364] duration metric: took 100.253µs to acquireMachinesLock for "ha-949000"
	I0831 15:36:10.180125    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:36:10.180144    3744 fix.go:54] fixHost starting: 
	I0831 15:36:10.180570    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:10.180626    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:10.189873    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51802
	I0831 15:36:10.190215    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:10.190587    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:10.190602    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:10.190832    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:10.190956    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:10.191047    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:36:10.191129    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:10.191205    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 2887
	I0831 15:36:10.192132    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 2887 missing from process table
	I0831 15:36:10.192166    3744 fix.go:112] recreateIfNeeded on ha-949000: state=Stopped err=<nil>
	I0831 15:36:10.192185    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	W0831 15:36:10.192270    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:36:10.235417    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000" ...
	I0831 15:36:10.258400    3744 main.go:141] libmachine: (ha-949000) Calling .Start
	I0831 15:36:10.258670    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:10.258717    3744 main.go:141] libmachine: (ha-949000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid
	I0831 15:36:10.260851    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 2887 missing from process table
	I0831 15:36:10.260866    3744 main.go:141] libmachine: (ha-949000) DBG | pid 2887 is in state "Stopped"
	I0831 15:36:10.260894    3744 main.go:141] libmachine: (ha-949000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid...
	I0831 15:36:10.261058    3744 main.go:141] libmachine: (ha-949000) DBG | Using UUID 98cab9ba-901d-49d1-9e6c-321a4533d56e
	I0831 15:36:10.370955    3744 main.go:141] libmachine: (ha-949000) DBG | Generated MAC ce:8:77:f7:42:5e
	I0831 15:36:10.370980    3744 main.go:141] libmachine: (ha-949000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:36:10.371093    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003a6900)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:10.371127    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003a6900)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:10.371175    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "98cab9ba-901d-49d1-9e6c-321a4533d56e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:36:10.371220    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 98cab9ba-901d-49d1-9e6c-321a4533d56e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:36:10.371232    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:36:10.372813    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 DEBUG: hyperkit: Pid is 3756
	I0831 15:36:10.373286    3744 main.go:141] libmachine: (ha-949000) DBG | Attempt 0
	I0831 15:36:10.373298    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:10.373398    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:36:10.375146    3744 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:36:10.375210    3744 main.go:141] libmachine: (ha-949000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:36:10.375229    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ebe4}
	I0831 15:36:10.375249    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d4eb85}
	I0831 15:36:10.375272    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:36:10.375287    3744 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4eabf}
	I0831 15:36:10.375330    3744 main.go:141] libmachine: (ha-949000) DBG | Found match: ce:8:77:f7:42:5e
	I0831 15:36:10.375341    3744 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:36:10.375350    3744 main.go:141] libmachine: (ha-949000) DBG | IP: 192.169.0.5
	I0831 15:36:10.376038    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:10.376245    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:10.376722    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:36:10.376735    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:10.376898    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:10.377023    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:10.377121    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:10.377226    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:10.377318    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:10.377457    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:10.377688    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:10.377699    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:36:10.380749    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:36:10.432938    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:36:10.433650    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:10.433669    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:10.433677    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:10.433685    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:10.813736    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:36:10.813750    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:36:10.928786    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:10.928808    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:10.928820    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:10.928840    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:10.929718    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:36:10.929729    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:10 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:36:16.483580    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:36:16.483594    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:36:16.483602    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:36:16.508100    3744 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:36:16 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:36:21.446393    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:36:21.446406    3744 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:36:21.446553    3744 buildroot.go:166] provisioning hostname "ha-949000"
	I0831 15:36:21.446562    3744 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:36:21.446665    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.446786    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.446905    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.447025    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.447124    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.447308    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.447472    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.447480    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000 && echo "ha-949000" | sudo tee /etc/hostname
	I0831 15:36:21.524007    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000
	
	I0831 15:36:21.524025    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.524158    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.524268    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.524375    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.524479    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.524631    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.524781    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.524792    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:36:21.591782    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:36:21.591802    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:36:21.591822    3744 buildroot.go:174] setting up certificates
	I0831 15:36:21.591828    3744 provision.go:84] configureAuth start
	I0831 15:36:21.591834    3744 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:36:21.591970    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:21.592077    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.592183    3744 provision.go:143] copyHostCerts
	I0831 15:36:21.592217    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:21.592287    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:36:21.592295    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:21.592443    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:36:21.592667    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:21.592706    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:36:21.592710    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:21.592784    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:36:21.592937    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:21.592978    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:36:21.592983    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:21.593095    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:36:21.593248    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000 san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]
	I0831 15:36:21.710940    3744 provision.go:177] copyRemoteCerts
	I0831 15:36:21.710993    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:36:21.711008    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.711135    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.711246    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.711328    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.711434    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:21.747436    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:36:21.747514    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:36:21.767330    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:36:21.767390    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0831 15:36:21.787147    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:36:21.787210    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:36:21.806851    3744 provision.go:87] duration metric: took 215.008206ms to configureAuth
	I0831 15:36:21.806864    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:36:21.807028    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:21.807041    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:21.807176    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.807304    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.807387    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.807476    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.807574    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.807684    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.807812    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.807819    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:36:21.869123    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:36:21.869137    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:36:21.869215    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:36:21.869228    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.869368    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.869456    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.869553    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.869651    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.869776    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.869915    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.869959    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:36:21.941116    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:36:21.941136    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:21.941270    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:21.941365    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.941441    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:21.941529    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:21.941663    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:21.941807    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:21.941819    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:36:23.639328    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:36:23.639343    3744 machine.go:96] duration metric: took 13.26247014s to provisionDockerMachine
	I0831 15:36:23.639354    3744 start.go:293] postStartSetup for "ha-949000" (driver="hyperkit")
	I0831 15:36:23.639362    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:36:23.639372    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.639572    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:36:23.639587    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.639684    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.639792    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.639927    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.640026    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.679356    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:36:23.683676    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:36:23.683690    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:36:23.683793    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:36:23.683980    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:36:23.683987    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:36:23.684187    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:36:23.697074    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:23.724665    3744 start.go:296] duration metric: took 85.300709ms for postStartSetup
	I0831 15:36:23.724694    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.724869    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:36:23.724883    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.724980    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.725089    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.725189    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.725280    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.763464    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:36:23.763527    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:36:23.797396    3744 fix.go:56] duration metric: took 13.617113477s for fixHost
	I0831 15:36:23.797420    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.797554    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.797655    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.797749    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.797839    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.797970    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:23.798114    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:36:23.798122    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:36:23.858158    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143783.919023246
	
	I0831 15:36:23.858170    3744 fix.go:216] guest clock: 1725143783.919023246
	I0831 15:36:23.858175    3744 fix.go:229] Guest: 2024-08-31 15:36:23.919023246 -0700 PDT Remote: 2024-08-31 15:36:23.79741 -0700 PDT m=+14.070978631 (delta=121.613246ms)
	I0831 15:36:23.858196    3744 fix.go:200] guest clock delta is within tolerance: 121.613246ms
	I0831 15:36:23.858200    3744 start.go:83] releasing machines lock for "ha-949000", held for 13.677948956s
	I0831 15:36:23.858225    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858359    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:23.858452    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858730    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858831    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:23.858919    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:36:23.858951    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.858972    3744 ssh_runner.go:195] Run: cat /version.json
	I0831 15:36:23.858983    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:23.859063    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.859085    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:23.859194    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.859214    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:23.859295    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.859309    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:23.859385    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.859397    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:23.890757    3744 ssh_runner.go:195] Run: systemctl --version
	I0831 15:36:23.938659    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:36:23.943864    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:36:23.943901    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:36:23.956026    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:36:23.956039    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:23.956147    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:23.971422    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:36:23.980435    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:36:23.989142    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:36:23.989181    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:36:23.997930    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:24.006635    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:36:24.015080    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:24.023671    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:36:24.032589    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:36:24.041364    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:36:24.050087    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:36:24.058866    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:36:24.066704    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:36:24.074622    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:24.168184    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:36:24.187633    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:24.187713    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:36:24.206675    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:24.220212    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:36:24.240424    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:24.250685    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:24.261052    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:36:24.286854    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:24.297197    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:24.312454    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:36:24.315602    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:36:24.323102    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:36:24.337130    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:36:24.434813    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:36:24.537809    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:36:24.537887    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:36:24.552112    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:24.656146    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:36:26.992775    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.336585914s)
	I0831 15:36:26.992844    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:36:27.003992    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:36:27.018708    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:27.029918    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:36:27.137311    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:36:27.239047    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:27.342173    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:36:27.356192    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:27.367097    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:27.470187    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:36:27.536105    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:36:27.536192    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:36:27.540763    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:36:27.540810    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:36:27.544037    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:36:27.570291    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:36:27.570367    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:27.588378    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:27.648285    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:36:27.648336    3744 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:36:27.648820    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:36:27.653344    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:36:27.662997    3744 kubeadm.go:883] updating cluster {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.
0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:36:27.663083    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:36:27.663134    3744 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:36:27.676654    3744 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:36:27.676670    3744 docker.go:615] Images already preloaded, skipping extraction
	I0831 15:36:27.676747    3744 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:36:27.690446    3744 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:36:27.690466    3744 cache_images.go:84] Images are preloaded, skipping loading
	I0831 15:36:27.690484    3744 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.31.0 docker true true} ...
	I0831 15:36:27.690565    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:36:27.690634    3744 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:36:27.729077    3744 cni.go:84] Creating CNI manager for ""
	I0831 15:36:27.729090    3744 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0831 15:36:27.729101    3744 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:36:27.729122    3744 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-949000 NodeName:ha-949000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:36:27.729202    3744 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-949000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 15:36:27.729215    3744 kube-vip.go:115] generating kube-vip config ...
	I0831 15:36:27.729267    3744 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:36:27.741901    3744 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:36:27.741972    3744 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:36:27.742025    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:36:27.751754    3744 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:36:27.751799    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0831 15:36:27.759784    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0831 15:36:27.773166    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:36:27.786640    3744 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0831 15:36:27.800639    3744 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:36:27.814083    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:36:27.817014    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:36:27.827332    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:27.924726    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:36:27.939552    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.5
	I0831 15:36:27.939571    3744 certs.go:194] generating shared ca certs ...
	I0831 15:36:27.939581    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:27.939767    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:36:27.939836    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:36:27.939848    3744 certs.go:256] generating profile certs ...
	I0831 15:36:27.939960    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:36:27.939980    3744 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7
	I0831 15:36:27.939996    3744 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0831 15:36:27.990143    3744 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7 ...
	I0831 15:36:27.990157    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7: {Name:mkcaa83b4b223ea37e242b23bc80c554e3269eac Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:27.990861    3744 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7 ...
	I0831 15:36:27.990872    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7: {Name:mk789cab6bc4fccb81a6d827e090943e3a032cb6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:27.991117    3744 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.f0a126f7 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:36:27.991353    3744 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.f0a126f7 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:36:27.991605    3744 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:36:27.991615    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:36:27.991642    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:36:27.991663    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:36:27.991688    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:36:27.991706    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:36:27.991724    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:36:27.991744    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:36:27.991761    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:36:27.991852    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:36:27.991900    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:36:27.991909    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:36:27.991937    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:36:27.991968    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:36:27.992001    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:36:27.992071    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:27.992107    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:36:27.992134    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:27.992153    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:36:27.992665    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:36:28.012619    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:36:28.037918    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:36:28.059676    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:36:28.085374    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0831 15:36:28.108665    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:36:28.134880    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:36:28.163351    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:36:28.189443    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:36:28.237208    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:36:28.275840    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:36:28.307738    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:36:28.327147    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:36:28.332485    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:36:28.341869    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:28.345319    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:28.345361    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:28.356453    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:36:28.366034    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:36:28.375170    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:36:28.378621    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:36:28.378656    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:36:28.382855    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:36:28.392032    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:36:28.401330    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:36:28.404932    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:36:28.404981    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:36:28.409135    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:36:28.418467    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:36:28.421857    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:36:28.426311    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:36:28.430575    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:36:28.435252    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:36:28.439597    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:36:28.443958    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 15:36:28.448329    3744 kubeadm.go:392] StartCluster: {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:36:28.448445    3744 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:36:28.461457    3744 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:36:28.469983    3744 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0831 15:36:28.469994    3744 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0831 15:36:28.470033    3744 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0831 15:36:28.478435    3744 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:36:28.478738    3744 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-949000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:28.478830    3744 kubeconfig.go:62] /Users/jenkins/minikube-integration/18943-957/kubeconfig needs updating (will repair): [kubeconfig missing "ha-949000" cluster setting kubeconfig missing "ha-949000" context setting]
	I0831 15:36:28.479071    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:28.479445    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:28.479626    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 15:36:28.479933    3744 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 15:36:28.480130    3744 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0831 15:36:28.488296    3744 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0831 15:36:28.488308    3744 kubeadm.go:597] duration metric: took 18.310201ms to restartPrimaryControlPlane
	I0831 15:36:28.488312    3744 kubeadm.go:394] duration metric: took 39.987749ms to StartCluster
	I0831 15:36:28.488320    3744 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:28.488392    3744 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:28.488767    3744 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:28.488978    3744 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:36:28.488992    3744 start.go:241] waiting for startup goroutines ...
	I0831 15:36:28.489001    3744 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 15:36:28.489144    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:28.531040    3744 out.go:177] * Enabled addons: 
	I0831 15:36:28.551931    3744 addons.go:510] duration metric: took 62.927579ms for enable addons: enabled=[]
	I0831 15:36:28.552016    3744 start.go:246] waiting for cluster config update ...
	I0831 15:36:28.552028    3744 start.go:255] writing updated cluster config ...
	I0831 15:36:28.574130    3744 out.go:201] 
	I0831 15:36:28.595598    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:28.595734    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:28.618331    3744 out.go:177] * Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	I0831 15:36:28.659956    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:36:28.659989    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:36:28.660178    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:36:28.660194    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:36:28.660319    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:28.661341    3744 start.go:360] acquireMachinesLock for ha-949000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:36:28.661436    3744 start.go:364] duration metric: took 71.648µs to acquireMachinesLock for "ha-949000-m02"
	I0831 15:36:28.661461    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:36:28.661470    3744 fix.go:54] fixHost starting: m02
	I0831 15:36:28.661902    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:28.661926    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:28.670964    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51824
	I0831 15:36:28.671287    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:28.671608    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:28.671619    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:28.671857    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:28.671991    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:28.672109    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:36:28.672201    3744 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:28.672291    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3528
	I0831 15:36:28.673213    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3528 missing from process table
	I0831 15:36:28.673240    3744 fix.go:112] recreateIfNeeded on ha-949000-m02: state=Stopped err=<nil>
	I0831 15:36:28.673248    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	W0831 15:36:28.673335    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:36:28.714811    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m02" ...
	I0831 15:36:28.736047    3744 main.go:141] libmachine: (ha-949000-m02) Calling .Start
	I0831 15:36:28.736403    3744 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:28.736434    3744 main.go:141] libmachine: (ha-949000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid
	I0831 15:36:28.738213    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3528 missing from process table
	I0831 15:36:28.738226    3744 main.go:141] libmachine: (ha-949000-m02) DBG | pid 3528 is in state "Stopped"
	I0831 15:36:28.738249    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid...
	I0831 15:36:28.738619    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Using UUID 23e5d675-5201-4f3d-86b7-b25c818528d1
	I0831 15:36:28.765315    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Generated MAC 92:7:3c:3f:ee:b7
	I0831 15:36:28.765332    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:36:28.765455    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c0a20)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:28.765495    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c0a20)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:36:28.765521    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "23e5d675-5201-4f3d-86b7-b25c818528d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:36:28.765553    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 23e5d675-5201-4f3d-86b7-b25c818528d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:36:28.765562    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:36:28.767165    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 DEBUG: hyperkit: Pid is 3763
	I0831 15:36:28.767495    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 0
	I0831 15:36:28.767509    3744 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:28.767583    3744 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3763
	I0831 15:36:28.769355    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:36:28.769415    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:36:28.769450    3744 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:36:28.769473    3744 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ebe4}
	I0831 15:36:28.769487    3744 main.go:141] libmachine: (ha-949000-m02) DBG | Found match: 92:7:3c:3f:ee:b7
	I0831 15:36:28.769498    3744 main.go:141] libmachine: (ha-949000-m02) DBG | IP: 192.169.0.6
	I0831 15:36:28.769505    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:36:28.770167    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:28.770374    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:36:28.770722    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:36:28.770732    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:28.770845    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:28.770937    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:28.771045    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:28.771147    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:28.771273    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:28.771413    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:28.771572    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:28.771580    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:36:28.775197    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:36:28.783845    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:36:28.784655    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:28.784674    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:28.784685    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:28.784693    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:28 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:29.168717    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:36:29.168732    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:36:29.283641    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:36:29.283661    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:36:29.283712    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:36:29.283753    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:36:29.284560    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:36:29.284571    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:36:34.866750    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 15:36:34.866767    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 15:36:34.866778    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 15:36:34.891499    3744 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:36:34 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 15:36:39.840129    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:36:39.840143    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:36:39.840307    3744 buildroot.go:166] provisioning hostname "ha-949000-m02"
	I0831 15:36:39.840319    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:36:39.840413    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:39.840489    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:39.840578    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.840665    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.840764    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:39.840907    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:39.841055    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:39.841064    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m02 && echo "ha-949000-m02" | sudo tee /etc/hostname
	I0831 15:36:39.913083    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m02
	
	I0831 15:36:39.913098    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:39.913252    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:39.913377    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.913471    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:39.913560    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:39.913685    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:39.913826    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:39.913837    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:36:39.987034    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:36:39.987048    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:36:39.987056    3744 buildroot.go:174] setting up certificates
	I0831 15:36:39.987062    3744 provision.go:84] configureAuth start
	I0831 15:36:39.987067    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:36:39.987204    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:39.987310    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:39.987418    3744 provision.go:143] copyHostCerts
	I0831 15:36:39.987447    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:39.987493    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:36:39.987499    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:36:39.988044    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:36:39.988241    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:39.988272    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:36:39.988277    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:36:39.988347    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:36:39.988492    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:39.988529    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:36:39.988533    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:36:39.988597    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:36:39.988746    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m02 san=[127.0.0.1 192.169.0.6 ha-949000-m02 localhost minikube]
	I0831 15:36:40.055665    3744 provision.go:177] copyRemoteCerts
	I0831 15:36:40.055717    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:36:40.055733    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.055998    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.056098    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.056185    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.056277    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:40.095370    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:36:40.095446    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:36:40.115272    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:36:40.115336    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:36:40.134845    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:36:40.134920    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:36:40.154450    3744 provision.go:87] duration metric: took 167.380587ms to configureAuth
	I0831 15:36:40.154464    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:36:40.154620    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:40.154633    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:40.154762    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.154852    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.154930    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.155003    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.155112    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.155216    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:40.155334    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:40.155341    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:36:40.220781    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:36:40.220794    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:36:40.220873    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:36:40.220884    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.221013    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.221103    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.221194    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.221272    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.221400    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:40.221546    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:40.221589    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:36:40.298646    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:36:40.298663    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:40.298789    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:40.298885    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.298979    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:40.299063    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:40.299201    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:40.299341    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:40.299353    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:36:41.956479    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:36:41.956495    3744 machine.go:96] duration metric: took 13.1856235s to provisionDockerMachine
	I0831 15:36:41.956502    3744 start.go:293] postStartSetup for "ha-949000-m02" (driver="hyperkit")
	I0831 15:36:41.956508    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:36:41.956522    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:41.956703    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:36:41.956716    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:41.956812    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:41.956896    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:41.956992    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:41.957077    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:42.000050    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:36:42.004306    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:36:42.004318    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:36:42.004439    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:36:42.004572    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:36:42.004578    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:36:42.004735    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:36:42.017617    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:42.041071    3744 start.go:296] duration metric: took 84.560659ms for postStartSetup
	I0831 15:36:42.041107    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.041300    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:36:42.041313    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:42.041398    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.041504    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.041609    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.041700    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:42.081048    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:36:42.081113    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:36:42.134445    3744 fix.go:56] duration metric: took 13.472828598s for fixHost
	I0831 15:36:42.134470    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:42.134618    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.134730    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.134822    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.134900    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.135030    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:36:42.135170    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:36:42.135178    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:36:42.199131    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143802.088359974
	
	I0831 15:36:42.199142    3744 fix.go:216] guest clock: 1725143802.088359974
	I0831 15:36:42.199147    3744 fix.go:229] Guest: 2024-08-31 15:36:42.088359974 -0700 PDT Remote: 2024-08-31 15:36:42.13446 -0700 PDT m=+32.407831620 (delta=-46.100026ms)
	I0831 15:36:42.199164    3744 fix.go:200] guest clock delta is within tolerance: -46.100026ms
	I0831 15:36:42.199169    3744 start.go:83] releasing machines lock for "ha-949000-m02", held for 13.537577271s
	I0831 15:36:42.199184    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.199330    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:42.220967    3744 out.go:177] * Found network options:
	I0831 15:36:42.242795    3744 out.go:177]   - NO_PROXY=192.169.0.5
	W0831 15:36:42.265056    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:36:42.265093    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.265983    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.266241    3744 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:36:42.266370    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:36:42.266410    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	W0831 15:36:42.266454    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:36:42.266575    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:36:42.266625    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:36:42.266633    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.266836    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:36:42.266871    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.267025    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:36:42.267062    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.267162    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:36:42.267189    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:36:42.267302    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	W0831 15:36:42.303842    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:36:42.303902    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:36:42.349152    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:36:42.349174    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:42.349280    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:42.365129    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:36:42.373393    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:36:42.381789    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:36:42.381831    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:36:42.389963    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:42.398325    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:36:42.406574    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:36:42.414917    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:36:42.423513    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:36:42.431936    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:36:42.440352    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:36:42.449208    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:36:42.457008    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:36:42.464909    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:42.567905    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:36:42.588297    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:36:42.588366    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:36:42.602440    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:42.618217    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:36:42.633678    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:36:42.645147    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:42.656120    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:36:42.679235    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:36:42.690584    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:36:42.706263    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:36:42.709220    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:36:42.717254    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:36:42.730693    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:36:42.826051    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:36:42.930594    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:36:42.930623    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:36:42.944719    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:43.038034    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:36:45.352340    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.314261795s)
	I0831 15:36:45.352402    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:36:45.362569    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:36:45.374992    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:45.385146    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:36:45.481701    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:36:45.590417    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:45.703387    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:36:45.717135    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:36:45.728130    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:45.822749    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:36:45.893539    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:36:45.893614    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:36:45.898396    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:36:45.898450    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:36:45.901472    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:36:45.929873    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:36:45.929947    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:45.947410    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:36:45.987982    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:36:46.029879    3744 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:36:46.051790    3744 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:36:46.052207    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:36:46.056767    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:36:46.066419    3744 mustload.go:65] Loading cluster: ha-949000
	I0831 15:36:46.066592    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:46.066799    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:46.066820    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:46.075457    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51846
	I0831 15:36:46.075806    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:46.076162    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:46.076180    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:46.076408    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:46.076531    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:36:46.076614    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:36:46.076682    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:36:46.077630    3744 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:36:46.077872    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:36:46.077895    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:36:46.086285    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51848
	I0831 15:36:46.086630    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:36:46.086945    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:36:46.086955    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:36:46.087205    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:36:46.087313    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:36:46.087418    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.6
	I0831 15:36:46.087426    3744 certs.go:194] generating shared ca certs ...
	I0831 15:36:46.087439    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:36:46.087575    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:36:46.087627    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:36:46.087636    3744 certs.go:256] generating profile certs ...
	I0831 15:36:46.087739    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:36:46.087826    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.e26aa346
	I0831 15:36:46.087882    3744 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:36:46.087890    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:36:46.087915    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:36:46.087944    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:36:46.087962    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:36:46.087979    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:36:46.087997    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:36:46.088015    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:36:46.088032    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:36:46.088113    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:36:46.088150    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:36:46.088158    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:36:46.088191    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:36:46.088226    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:36:46.088254    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:36:46.088317    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:36:46.088349    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.088368    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.088390    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.088420    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:36:46.088505    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:36:46.088596    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:36:46.088688    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:36:46.088763    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:36:46.117725    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:36:46.121346    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:36:46.129782    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:36:46.133012    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:36:46.141510    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:36:46.144605    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:36:46.152913    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:36:46.156010    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:36:46.165156    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:36:46.168250    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:36:46.176838    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:36:46.179929    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:36:46.189075    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:36:46.209492    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:36:46.229359    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:36:46.249285    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:36:46.268964    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0831 15:36:46.288566    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:36:46.308035    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:36:46.327968    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:36:46.347874    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:36:46.367538    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:36:46.387135    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:36:46.406841    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:36:46.420747    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:36:46.434267    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:36:46.447929    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:36:46.461487    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:36:46.475040    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:36:46.488728    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:36:46.502198    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:36:46.506532    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:36:46.514857    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.518202    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.518240    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:36:46.522435    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:36:46.530730    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:36:46.538900    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.542200    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.542233    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:36:46.546382    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:36:46.554646    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:36:46.562775    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.566092    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.566127    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:36:46.570335    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:36:46.578778    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:36:46.582068    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:36:46.586501    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:36:46.590751    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:36:46.594979    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:36:46.599120    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:36:46.603290    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 15:36:46.607503    3744 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.31.0 docker true true} ...
	I0831 15:36:46.607561    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:36:46.607581    3744 kube-vip.go:115] generating kube-vip config ...
	I0831 15:36:46.607619    3744 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:36:46.620005    3744 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:36:46.620042    3744 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:36:46.620097    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:36:46.627507    3744 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:36:46.627555    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:36:46.634842    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:36:46.648529    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:36:46.661781    3744 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:36:46.675402    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:36:46.678250    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:36:46.687467    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:46.779379    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:36:46.793112    3744 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:36:46.793294    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:36:46.814624    3744 out.go:177] * Verifying Kubernetes components...
	I0831 15:36:46.835323    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:36:46.948649    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:36:46.960452    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:36:46.960657    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:36:46.960690    3744 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:36:46.960842    3744 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:36:46.960927    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:46.960932    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:46.960940    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:46.960943    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.801259    3744 round_trippers.go:574] Response Status: 200 OK in 8840 milliseconds
	I0831 15:36:55.802034    3744 node_ready.go:49] node "ha-949000-m02" has status "Ready":"True"
	I0831 15:36:55.802046    3744 node_ready.go:38] duration metric: took 8.841092254s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:36:55.802051    3744 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:36:55.802085    3744 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 15:36:55.802094    3744 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 15:36:55.802131    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:36:55.802136    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.802142    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.802147    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.817181    3744 round_trippers.go:574] Response Status: 200 OK in 15 milliseconds
	I0831 15:36:55.823106    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.823166    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:36:55.823172    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.823178    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.823182    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.833336    3744 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0831 15:36:55.833806    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:55.833817    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.833824    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.833829    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.843262    3744 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0831 15:36:55.843562    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.843572    3744 pod_ready.go:82] duration metric: took 20.449445ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.843595    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.843648    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:36:55.843655    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.843662    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.843667    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.846571    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:55.846969    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:55.846976    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.846982    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.846985    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.848597    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.848912    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.848921    3744 pod_ready.go:82] duration metric: took 5.319208ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.848934    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.848970    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:36:55.848975    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.848981    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.848985    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.850738    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.851195    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:55.851203    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.851209    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.851212    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.852625    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.853038    3744 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.853047    3744 pod_ready.go:82] duration metric: took 4.107015ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.853053    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.853087    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:36:55.853092    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.853100    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.853104    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.854440    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.854845    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:55.854852    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.854858    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.854861    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.856182    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:55.856534    3744 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:55.856542    3744 pod_ready.go:82] duration metric: took 3.483952ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.856548    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:55.856578    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:36:55.856582    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:55.856588    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:55.856592    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:55.858303    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:56.003107    3744 request.go:632] Waited for 144.429757ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:56.003176    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:56.003183    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.003189    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.003193    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.004813    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:56.005140    3744 pod_ready.go:93] pod "etcd-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:56.005149    3744 pod_ready.go:82] duration metric: took 148.59533ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.005160    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.202344    3744 request.go:632] Waited for 197.12667ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:36:56.202386    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:36:56.202417    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.202425    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.202428    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.205950    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:56.403821    3744 request.go:632] Waited for 197.364477ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:56.403986    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:56.403997    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.404008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.404017    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.407269    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:56.407644    3744 pod_ready.go:98] node "ha-949000" hosting pod "kube-apiserver-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:56.407658    3744 pod_ready.go:82] duration metric: took 402.487822ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	E0831 15:36:56.407673    3744 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000" hosting pod "kube-apiserver-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:56.407681    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.602890    3744 request.go:632] Waited for 195.157951ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:36:56.602980    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:36:56.602991    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.603003    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.603010    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.606100    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:56.802222    3744 request.go:632] Waited for 195.71026ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:56.802289    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:56.802295    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:56.802301    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:56.802305    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:56.804612    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:56.804914    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:56.804923    3744 pod_ready.go:82] duration metric: took 397.232028ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:56.804930    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.003522    3744 request.go:632] Waited for 198.554376ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:36:57.003559    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:36:57.003600    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.003608    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.003618    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.005675    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.203456    3744 request.go:632] Waited for 197.402218ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:57.203520    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:36:57.203526    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.203532    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.203537    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.206124    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.206516    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:36:57.206526    3744 pod_ready.go:82] duration metric: took 401.586021ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.206534    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.402973    3744 request.go:632] Waited for 196.400032ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:36:57.403011    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:36:57.403017    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.403051    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.403056    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.405260    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.603636    3744 request.go:632] Waited for 197.987151ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:57.603708    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:36:57.603713    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.603719    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.603724    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.606022    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:57.606364    3744 pod_ready.go:98] node "ha-949000" hosting pod "kube-controller-manager-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:57.606376    3744 pod_ready.go:82] duration metric: took 399.83214ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	E0831 15:36:57.606383    3744 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000" hosting pod "kube-controller-manager-ha-949000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-949000" has status "Ready":"False"
	I0831 15:36:57.606388    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:36:57.802885    3744 request.go:632] Waited for 196.449707ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:57.803017    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:57.803028    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:57.803039    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:57.803046    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:57.806339    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.003449    3744 request.go:632] Waited for 196.421818ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.003513    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.003518    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.003524    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.003527    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.005621    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:36:58.203691    3744 request.go:632] Waited for 95.498322ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:58.203749    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:58.203758    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.203763    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.203766    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.207046    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.403784    3744 request.go:632] Waited for 196.241368ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.403948    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.403963    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.403974    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.404010    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.407767    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.608224    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:58.608245    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.608257    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.608265    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.611367    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:58.802284    3744 request.go:632] Waited for 190.220665ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.802382    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:58.802393    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:58.802407    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:58.802421    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:58.806173    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.108214    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:59.108238    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.108248    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.108332    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.111913    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.202533    3744 request.go:632] Waited for 89.639104ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:59.202672    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:59.202684    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.202693    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.202700    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.205790    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.608244    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:36:59.608308    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.608333    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.608346    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.611536    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:36:59.612038    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:36:59.612050    3744 round_trippers.go:469] Request Headers:
	I0831 15:36:59.612056    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:36:59.612059    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:36:59.613486    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:36:59.613797    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:00.108234    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:00.108258    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.108269    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.108276    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.112243    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:00.112803    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:00.112811    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.112816    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.112819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.114922    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:00.608266    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:00.608291    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.608340    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.608348    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.611571    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:00.612033    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:00.612041    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:00.612047    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:00.612051    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:00.614268    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:01.108244    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:01.108270    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.108282    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.108287    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.112176    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:01.112688    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:01.112697    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.112703    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.112706    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.114756    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:01.608252    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:01.608269    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.608303    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.608308    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.610548    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:01.610932    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:01.610940    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:01.610946    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:01.610951    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:01.612574    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:02.108349    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:02.108375    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.108386    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.108392    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.111907    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:02.112645    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:02.112653    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.112658    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.112662    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.114143    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:02.114439    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:02.608228    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:02.608245    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.608252    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.608256    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.610772    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:02.611191    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:02.611199    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:02.611206    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:02.611210    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:02.613037    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:03.108219    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:03.108235    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.108241    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.108250    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.111668    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:03.112196    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:03.112204    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.112211    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.112214    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.114279    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:03.608402    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:03.608463    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.608509    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.608524    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.611720    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:03.612413    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:03.612424    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:03.612432    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:03.612436    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:03.615410    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:04.108309    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:04.108328    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.108337    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.108341    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.115334    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:04.115796    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:04.115804    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.115815    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.115818    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.122611    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:04.122876    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:04.608750    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:04.608825    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.608840    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.608846    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.612925    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:04.613492    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:04.613499    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:04.613505    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:04.613509    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:04.614977    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:05.106817    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:05.106842    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.106852    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.106859    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.110466    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:05.111095    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:05.111106    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.111113    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.111117    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.112615    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:05.608187    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:05.608211    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.608224    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.608248    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.611732    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:05.612260    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:05.612270    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:05.612278    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:05.612284    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:05.614120    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:06.107506    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:06.107527    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.107540    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.107545    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.110547    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:06.111218    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:06.111229    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.111237    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.111242    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.112971    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:06.607368    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:06.607380    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.607386    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.607391    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.609787    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:06.610207    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:06.610215    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:06.610221    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:06.610224    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:06.611989    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:06.612289    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:07.107726    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:07.107744    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.107773    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.107777    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.109482    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:07.109930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:07.109937    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.109943    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.109947    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.111448    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:07.607689    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:07.607742    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.607753    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.607759    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.610882    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:07.611345    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:07.611353    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:07.611359    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:07.611369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:07.613392    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:08.107409    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:08.107435    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.107446    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.107451    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.111199    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:08.111808    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:08.111815    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.111820    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.111825    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.113569    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:08.607450    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:08.607477    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.607489    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.607494    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.611034    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:08.611547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:08.611557    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:08.611563    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:08.611568    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:08.613347    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:08.613756    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:09.108698    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:09.108730    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.108778    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.108791    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.112115    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:09.112783    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:09.112791    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.112796    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.112803    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.114417    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:09.606780    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:09.606804    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.606816    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.606824    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.609915    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:09.610481    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:09.610488    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:09.610494    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:09.610497    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:09.612172    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:10.106727    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:10.106745    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.106779    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.106786    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.109423    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:10.109937    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:10.109944    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.109950    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.109953    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.111717    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:10.607619    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:10.607642    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.607653    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.607658    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.610928    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:10.611460    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:10.611467    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:10.611472    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:10.611475    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:10.613024    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:11.108825    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:11.108848    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.108859    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.108865    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.112708    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:11.113184    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:11.113195    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.113202    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.113207    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.115187    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:11.116261    3744 pod_ready.go:103] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:11.607215    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:11.607243    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.607254    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.607261    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.611037    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:11.611547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:11.611557    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:11.611565    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:11.611569    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:11.613373    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.108739    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:12.108764    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.108774    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.108779    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.112484    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:12.113117    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:12.113125    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.113131    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.113135    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.114878    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.608099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:12.608124    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.608133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.608140    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.611866    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:12.612563    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:12.612571    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.612577    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.612581    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.614297    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.614794    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.614803    3744 pod_ready.go:82] duration metric: took 15.008248116s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.614810    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.614849    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:37:12.614854    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.614860    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.614864    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.617726    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:12.618084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:12.618092    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.618097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.618100    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.619622    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.620160    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.620169    3744 pod_ready.go:82] duration metric: took 5.352553ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.620175    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.620212    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:37:12.620217    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.620222    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.620225    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.624634    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:12.625059    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:12.625066    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.625071    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.625074    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.626559    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.626901    3744 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.626910    3744 pod_ready.go:82] duration metric: took 6.729281ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.626916    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.626951    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:37:12.626956    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.626961    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.626964    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.628480    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.628945    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:12.628956    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.628961    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.628965    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.630425    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.630760    3744 pod_ready.go:93] pod "kube-proxy-d45q5" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:12.630769    3744 pod_ready.go:82] duration metric: took 3.847336ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.630775    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:12.630807    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:12.630812    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.630817    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.630821    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.632536    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:12.633060    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:12.633067    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:12.633072    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:12.633077    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:12.634424    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:13.132549    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:13.132573    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.132585    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.132591    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.135680    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:13.136120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:13.136128    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.136133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.136137    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.137931    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:13.632454    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:13.632468    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.632474    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.632477    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.634478    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:13.634979    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:13.634987    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:13.634992    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:13.634997    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:13.636493    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:14.132750    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:14.132776    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.132788    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.132794    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.136342    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:14.136985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:14.136993    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.136999    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.137002    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.139021    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:14.630998    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:14.631010    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.631017    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.631019    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.637296    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:14.637754    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:14.637761    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:14.637767    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:14.637770    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:14.645976    3744 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0831 15:37:14.646303    3744 pod_ready.go:103] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:15.131375    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:15.131389    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.131395    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.131398    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.136989    3744 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:37:15.137543    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:15.137552    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.137557    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.137561    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.145480    3744 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:37:15.631037    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:15.631049    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.631056    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.631060    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.650939    3744 round_trippers.go:574] Response Status: 200 OK in 19 milliseconds
	I0831 15:37:15.657344    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:15.657354    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:15.657360    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:15.657363    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:15.664319    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:16.131044    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:16.131056    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.131062    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.131065    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.133359    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:16.133806    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:16.133815    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.133821    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.133835    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.135405    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:16.631836    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:16.631848    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.631854    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.631858    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.633942    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:16.634428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:16.634436    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:16.634442    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:16.634449    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:16.636230    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.131746    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:17.131800    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.131814    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.131820    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.135452    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:17.136132    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:17.136139    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.136145    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.136148    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.137779    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.138135    3744 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.138143    3744 pod_ready.go:82] duration metric: took 4.507315671s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.138150    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.138183    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:37:17.138187    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.138193    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.138198    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.140005    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.140372    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:17.140380    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.140385    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.140388    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.142052    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.142371    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.142380    3744 pod_ready.go:82] duration metric: took 4.22523ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.142387    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.142420    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:37:17.142425    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.142430    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.142433    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.144162    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.144573    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:17.144580    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.144585    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.144591    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.146052    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:17.146407    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.146415    3744 pod_ready.go:82] duration metric: took 4.022752ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.146422    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.208351    3744 request.go:632] Waited for 61.893937ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:17.208418    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:17.208435    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.208444    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.208449    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.211070    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:17.408566    3744 request.go:632] Waited for 197.051034ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:17.408606    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:17.408614    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.408622    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.408627    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.410767    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:17.411178    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:17.411187    3744 pod_ready.go:82] duration metric: took 264.75731ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:17.411194    3744 pod_ready.go:39] duration metric: took 21.608904421s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:37:17.411208    3744 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:37:17.411260    3744 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:37:17.423683    3744 api_server.go:72] duration metric: took 30.630215512s to wait for apiserver process to appear ...
	I0831 15:37:17.423694    3744 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:37:17.423707    3744 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:37:17.427947    3744 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:37:17.427987    3744 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:37:17.427992    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.427998    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.428008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.428562    3744 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:37:17.428682    3744 api_server.go:141] control plane version: v1.31.0
	I0831 15:37:17.428691    3744 api_server.go:131] duration metric: took 4.99355ms to wait for apiserver health ...
	I0831 15:37:17.428699    3744 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:37:17.609319    3744 request.go:632] Waited for 180.546017ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:17.609356    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:17.609364    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.609372    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.609378    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.615729    3744 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:37:17.620529    3744 system_pods.go:59] 24 kube-system pods found
	I0831 15:37:17.620549    3744 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:17.620557    3744 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:17.620562    3744 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:37:17.620566    3744 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:37:17.620569    3744 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:37:17.620572    3744 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:37:17.620577    3744 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:37:17.620581    3744 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:37:17.620583    3744 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:37:17.620586    3744 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:37:17.620588    3744 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:37:17.620593    3744 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0831 15:37:17.620596    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:37:17.620599    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:37:17.620602    3744 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:37:17.620605    3744 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:37:17.620607    3744 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:37:17.620610    3744 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:37:17.620612    3744 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:37:17.620615    3744 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:37:17.620617    3744 system_pods.go:61] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:37:17.620620    3744 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:37:17.620622    3744 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:37:17.620625    3744 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:37:17.620628    3744 system_pods.go:74] duration metric: took 191.923916ms to wait for pod list to return data ...
	I0831 15:37:17.620634    3744 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:37:17.808285    3744 request.go:632] Waited for 187.597884ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:17.808399    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:17.808411    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:17.808422    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:17.808429    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:17.812254    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:17.812385    3744 default_sa.go:45] found service account: "default"
	I0831 15:37:17.812394    3744 default_sa.go:55] duration metric: took 191.75371ms for default service account to be created ...
	I0831 15:37:17.812410    3744 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:37:18.009398    3744 request.go:632] Waited for 196.900555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:18.009462    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:18.009503    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:18.009518    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:18.009526    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:18.017075    3744 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:37:18.022069    3744 system_pods.go:86] 24 kube-system pods found
	I0831 15:37:18.022087    3744 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:18.022093    3744 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 15:37:18.022097    3744 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:37:18.022101    3744 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:37:18.022105    3744 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:37:18.022108    3744 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:37:18.022111    3744 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:37:18.022114    3744 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:37:18.022117    3744 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:37:18.022120    3744 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:37:18.022123    3744 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:37:18.022127    3744 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0831 15:37:18.022131    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:37:18.022134    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:37:18.022138    3744 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:37:18.022140    3744 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:37:18.022143    3744 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:37:18.022146    3744 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:37:18.022148    3744 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:37:18.022152    3744 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:37:18.022155    3744 system_pods.go:89] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:37:18.022157    3744 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:37:18.022160    3744 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:37:18.022162    3744 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:37:18.022168    3744 system_pods.go:126] duration metric: took 209.74863ms to wait for k8s-apps to be running ...
	I0831 15:37:18.022173    3744 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:37:18.022230    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:37:18.033610    3744 system_svc.go:56] duration metric: took 11.428501ms WaitForService to wait for kubelet
	I0831 15:37:18.033632    3744 kubeadm.go:582] duration metric: took 31.24015665s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:37:18.033647    3744 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:37:18.208845    3744 request.go:632] Waited for 175.149396ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:37:18.208908    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:37:18.208914    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:18.208921    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:18.208926    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:18.213884    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:18.214480    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:37:18.214495    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:37:18.214504    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:37:18.214507    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:37:18.214510    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:37:18.214513    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:37:18.214516    3744 node_conditions.go:105] duration metric: took 180.864612ms to run NodePressure ...
	I0831 15:37:18.214525    3744 start.go:241] waiting for startup goroutines ...
	I0831 15:37:18.214542    3744 start.go:255] writing updated cluster config ...
	I0831 15:37:18.235038    3744 out.go:201] 
	I0831 15:37:18.272074    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:18.272141    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:37:18.293920    3744 out.go:177] * Starting "ha-949000-m03" control-plane node in "ha-949000" cluster
	I0831 15:37:18.336055    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:37:18.336091    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:37:18.336291    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:37:18.336317    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:37:18.336472    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:37:18.337744    3744 start.go:360] acquireMachinesLock for ha-949000-m03: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:37:18.337863    3744 start.go:364] duration metric: took 91.481µs to acquireMachinesLock for "ha-949000-m03"
	I0831 15:37:18.337896    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:37:18.337907    3744 fix.go:54] fixHost starting: m03
	I0831 15:37:18.338304    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:37:18.338331    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:37:18.347585    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51853
	I0831 15:37:18.347933    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:37:18.348309    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:37:18.348325    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:37:18.348554    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:37:18.348680    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:18.348764    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetState
	I0831 15:37:18.348835    3744 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:18.348927    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3227
	I0831 15:37:18.349821    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid 3227 missing from process table
	I0831 15:37:18.349851    3744 fix.go:112] recreateIfNeeded on ha-949000-m03: state=Stopped err=<nil>
	I0831 15:37:18.349859    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	W0831 15:37:18.349928    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:37:18.371074    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m03" ...
	I0831 15:37:18.413086    3744 main.go:141] libmachine: (ha-949000-m03) Calling .Start
	I0831 15:37:18.413447    3744 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:18.413507    3744 main.go:141] libmachine: (ha-949000-m03) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid
	I0831 15:37:18.415280    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid 3227 missing from process table
	I0831 15:37:18.415294    3744 main.go:141] libmachine: (ha-949000-m03) DBG | pid 3227 is in state "Stopped"
	I0831 15:37:18.415313    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid...
	I0831 15:37:18.415660    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Using UUID 3fdefe95-7552-4d5b-8412-6ae6e5c787bb
	I0831 15:37:18.441752    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Generated MAC fa:59:9e:3b:35:6d
	I0831 15:37:18.441781    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:37:18.441964    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00037b4a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:37:18.442001    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"3fdefe95-7552-4d5b-8412-6ae6e5c787bb", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00037b4a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:37:18.442067    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "3fdefe95-7552-4d5b-8412-6ae6e5c787bb", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:37:18.442136    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 3fdefe95-7552-4d5b-8412-6ae6e5c787bb -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/ha-949000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:37:18.442155    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:37:18.443921    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 DEBUG: hyperkit: Pid is 3783
	I0831 15:37:18.444292    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Attempt 0
	I0831 15:37:18.444304    3744 main.go:141] libmachine: (ha-949000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:18.444362    3744 main.go:141] libmachine: (ha-949000-m03) DBG | hyperkit pid from json: 3783
	I0831 15:37:18.446124    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Searching for fa:59:9e:3b:35:6d in /var/db/dhcpd_leases ...
	I0831 15:37:18.446228    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:37:18.446248    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:37:18.446260    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:37:18.446272    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d4eb85}
	I0831 15:37:18.446306    3744 main.go:141] libmachine: (ha-949000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eb32}
	I0831 15:37:18.446321    3744 main.go:141] libmachine: (ha-949000-m03) DBG | Found match: fa:59:9e:3b:35:6d
	I0831 15:37:18.446335    3744 main.go:141] libmachine: (ha-949000-m03) DBG | IP: 192.169.0.7
	I0831 15:37:18.446363    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetConfigRaw
	I0831 15:37:18.447082    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:18.447293    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:37:18.447693    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:37:18.447703    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:18.447827    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:18.447958    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:18.448072    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:18.448161    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:18.448250    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:18.448355    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:18.448517    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:18.448526    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:37:18.451810    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:37:18.461189    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:37:18.462060    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:37:18.462072    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:37:18.462081    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:37:18.462086    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:37:18.852728    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:37:18.852743    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:37:18.968113    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:37:18.968132    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:37:18.968140    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:37:18.968171    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:37:18.968968    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:37:18.968978    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:18 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:37:24.540624    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 15:37:24.540682    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 15:37:24.540695    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 15:37:24.565460    3744 main.go:141] libmachine: (ha-949000-m03) DBG | 2024/08/31 15:37:24 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 15:37:29.520863    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:37:29.520878    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:37:29.521004    3744 buildroot.go:166] provisioning hostname "ha-949000-m03"
	I0831 15:37:29.521015    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:37:29.521111    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.521203    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.521290    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.521386    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.521482    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.521612    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.521765    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.521776    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m03 && echo "ha-949000-m03" | sudo tee /etc/hostname
	I0831 15:37:29.591531    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m03
	
	I0831 15:37:29.591551    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.591708    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.591803    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.591884    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.591995    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.592173    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.592330    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.592341    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:37:29.658685    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:37:29.658701    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:37:29.658714    3744 buildroot.go:174] setting up certificates
	I0831 15:37:29.658720    3744 provision.go:84] configureAuth start
	I0831 15:37:29.658727    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetMachineName
	I0831 15:37:29.658867    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:29.658966    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.659054    3744 provision.go:143] copyHostCerts
	I0831 15:37:29.659089    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:37:29.659140    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:37:29.659146    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:37:29.659263    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:37:29.659455    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:37:29.659484    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:37:29.659488    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:37:29.659564    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:37:29.659714    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:37:29.659747    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:37:29.659753    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:37:29.659818    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:37:29.659964    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m03 san=[127.0.0.1 192.169.0.7 ha-949000-m03 localhost minikube]
	I0831 15:37:29.736089    3744 provision.go:177] copyRemoteCerts
	I0831 15:37:29.736163    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:37:29.736179    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.736322    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.736416    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.736504    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.736597    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:29.771590    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:37:29.771658    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:37:29.791254    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:37:29.791326    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:37:29.810923    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:37:29.810991    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:37:29.830631    3744 provision.go:87] duration metric: took 171.900577ms to configureAuth
	I0831 15:37:29.830645    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:37:29.830811    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:29.830824    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:29.830954    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.831042    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.831126    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.831207    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.831289    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.831399    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.831522    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.831530    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:37:29.892205    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:37:29.892217    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:37:29.892291    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:37:29.892302    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.892426    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.892516    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.892609    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.892714    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.892838    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.892976    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.893022    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:37:29.961258    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:37:29.961276    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:29.961414    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:29.961511    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.961619    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:29.961703    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:29.961817    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:29.961955    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:29.961967    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:37:31.615783    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:37:31.615799    3744 machine.go:96] duration metric: took 13.167957184s to provisionDockerMachine
	I0831 15:37:31.615806    3744 start.go:293] postStartSetup for "ha-949000-m03" (driver="hyperkit")
	I0831 15:37:31.615814    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:37:31.615823    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.616028    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:37:31.616046    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.616158    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.616258    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.616349    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.616481    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:31.654537    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:37:31.657860    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:37:31.657873    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:37:31.657960    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:37:31.658093    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:37:31.658099    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:37:31.658258    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:37:31.672215    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:37:31.694606    3744 start.go:296] duration metric: took 78.79067ms for postStartSetup
	I0831 15:37:31.694628    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.694811    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:37:31.694825    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.694916    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.695011    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.695099    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.695179    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:31.731833    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:37:31.731896    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:37:31.763292    3744 fix.go:56] duration metric: took 13.425238964s for fixHost
	I0831 15:37:31.763317    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.763450    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.763540    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.763638    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.763730    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.763846    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:37:31.764005    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0831 15:37:31.764012    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:37:31.823707    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143851.888011101
	
	I0831 15:37:31.823721    3744 fix.go:216] guest clock: 1725143851.888011101
	I0831 15:37:31.823727    3744 fix.go:229] Guest: 2024-08-31 15:37:31.888011101 -0700 PDT Remote: 2024-08-31 15:37:31.763307 -0700 PDT m=+82.036146513 (delta=124.704101ms)
	I0831 15:37:31.823737    3744 fix.go:200] guest clock delta is within tolerance: 124.704101ms
	I0831 15:37:31.823741    3744 start.go:83] releasing machines lock for "ha-949000-m03", held for 13.485720355s
	I0831 15:37:31.823765    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.823906    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:31.845130    3744 out.go:177] * Found network options:
	I0831 15:37:31.865299    3744 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0831 15:37:31.886126    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:37:31.886160    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:37:31.886178    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.886943    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.887142    3744 main.go:141] libmachine: (ha-949000-m03) Calling .DriverName
	I0831 15:37:31.887254    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:37:31.887286    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	W0831 15:37:31.887368    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:37:31.887394    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:37:31.887504    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:37:31.887511    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.887521    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHHostname
	I0831 15:37:31.887696    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHPort
	I0831 15:37:31.887743    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.887910    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHKeyPath
	I0831 15:37:31.887987    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.888104    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetSSHUsername
	I0831 15:37:31.888248    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	I0831 15:37:31.888351    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m03/id_rsa Username:docker}
	W0831 15:37:31.921752    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:37:31.921817    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:37:31.966799    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:37:31.966823    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:37:31.966938    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:37:31.983482    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:37:31.992712    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:37:32.002010    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:37:32.002056    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:37:32.011011    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:37:32.020061    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:37:32.028982    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:37:32.038569    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:37:32.048027    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:37:32.057745    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:37:32.066832    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:37:32.075930    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:37:32.084234    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:37:32.092513    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:32.200002    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:37:32.218717    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:37:32.218782    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:37:32.234470    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:37:32.246859    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:37:32.268072    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:37:32.279723    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:37:32.291270    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:37:32.313992    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:37:32.325465    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:37:32.340891    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:37:32.343755    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:37:32.351807    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:37:32.365348    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:37:32.460495    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:37:32.562594    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:37:32.562619    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:37:32.576763    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:32.677110    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:37:34.994745    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.317591857s)
	I0831 15:37:34.994823    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:37:35.005392    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:37:35.018138    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:37:35.028648    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:37:35.124983    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:37:35.235732    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:35.346302    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:37:35.360082    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:37:35.370959    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:35.477096    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:37:35.544102    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:37:35.544184    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:37:35.548776    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:37:35.548834    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:37:35.551795    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:37:35.578659    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:37:35.578734    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:37:35.596206    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:37:35.640045    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:37:35.682013    3744 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:37:35.703018    3744 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:37:35.723860    3744 main.go:141] libmachine: (ha-949000-m03) Calling .GetIP
	I0831 15:37:35.724174    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:37:35.728476    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:37:35.738147    3744 mustload.go:65] Loading cluster: ha-949000
	I0831 15:37:35.738335    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:35.738551    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:37:35.738572    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:37:35.747642    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51875
	I0831 15:37:35.747990    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:37:35.748302    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:37:35.748315    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:37:35.748544    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:37:35.748655    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:37:35.748733    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:37:35.748808    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:37:35.749749    3744 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:37:35.749998    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:37:35.750023    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:37:35.758673    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51877
	I0831 15:37:35.758994    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:37:35.759349    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:37:35.759365    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:37:35.759557    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:37:35.759653    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:37:35.759755    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.7
	I0831 15:37:35.759761    3744 certs.go:194] generating shared ca certs ...
	I0831 15:37:35.759770    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:37:35.759913    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:37:35.759965    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:37:35.759974    3744 certs.go:256] generating profile certs ...
	I0831 15:37:35.760073    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:37:35.760161    3744 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.0c0868f3
	I0831 15:37:35.760221    3744 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:37:35.760228    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:37:35.760249    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:37:35.760273    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:37:35.760292    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:37:35.760308    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:37:35.760333    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:37:35.760352    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:37:35.760368    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:37:35.760450    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:37:35.760489    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:37:35.760497    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:37:35.760534    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:37:35.760565    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:37:35.760594    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:37:35.760658    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:37:35.760694    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:35.760715    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:37:35.760733    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:37:35.760757    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:37:35.760839    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:37:35.760910    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:37:35.761012    3744 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:37:35.761091    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:37:35.789354    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:37:35.793275    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:37:35.801794    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:37:35.805175    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:37:35.813194    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:37:35.816357    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:37:35.824019    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:37:35.827176    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:37:35.835398    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:37:35.838546    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:37:35.847890    3744 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:37:35.851045    3744 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:37:35.858866    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:37:35.879287    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:37:35.899441    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:37:35.919810    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:37:35.940109    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0831 15:37:35.960051    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 15:37:35.979638    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:37:35.999504    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:37:36.019089    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:37:36.039173    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:37:36.058828    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:37:36.078456    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:37:36.092789    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:37:36.106379    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:37:36.119946    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:37:36.133839    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:37:36.148101    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:37:36.161739    3744 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:37:36.175159    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:37:36.179390    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:37:36.187703    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:37:36.191071    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:37:36.191114    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:37:36.195292    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:37:36.203552    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:37:36.212239    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:36.215746    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:36.215790    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:37:36.219988    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:37:36.228608    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:37:36.237421    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:37:36.240805    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:37:36.240843    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:37:36.245119    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:37:36.253604    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:37:36.256982    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:37:36.261329    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:37:36.265579    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:37:36.269756    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:37:36.273922    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:37:36.278236    3744 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 15:37:36.282870    3744 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.31.0 docker true true} ...
	I0831 15:37:36.282943    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:37:36.282961    3744 kube-vip.go:115] generating kube-vip config ...
	I0831 15:37:36.283008    3744 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:37:36.296221    3744 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:37:36.296272    3744 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:37:36.296330    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:37:36.304482    3744 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:37:36.304539    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:37:36.311975    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:37:36.325288    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:37:36.338951    3744 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:37:36.352501    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:37:36.355411    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:37:36.364926    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:36.456418    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:37:36.471558    3744 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:37:36.471752    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:37:36.529525    3744 out.go:177] * Verifying Kubernetes components...
	I0831 15:37:36.550389    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:37:36.691381    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:37:36.709538    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:37:36.709731    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:37:36.709775    3744 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:37:36.709942    3744 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m03" to be "Ready" ...
	I0831 15:37:36.709989    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:36.709994    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.710000    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.710003    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.712128    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:36.712576    3744 node_ready.go:49] node "ha-949000-m03" has status "Ready":"True"
	I0831 15:37:36.712585    3744 node_ready.go:38] duration metric: took 2.63459ms for node "ha-949000-m03" to be "Ready" ...
	I0831 15:37:36.712591    3744 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:37:36.712631    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:36.712638    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.712643    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.712650    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.716253    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:36.722917    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:36.722974    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:36.722980    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.722986    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.722989    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.725559    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:36.726201    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:36.726209    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:36.726215    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:36.726231    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:36.728257    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:37.223697    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:37.223717    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.223728    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.223737    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.235316    3744 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0831 15:37:37.236200    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:37.236213    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.236221    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.236224    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.238445    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:37.723177    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:37.723191    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.723198    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.723201    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.730411    3744 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:37:37.731034    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:37.731043    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:37.731048    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:37.731053    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:37.733549    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.223151    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:38.223168    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.223174    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.223177    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.225984    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.226378    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:38.226386    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.226391    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.226394    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.229300    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.724309    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:38.724325    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.724334    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.724337    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.726908    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.727435    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:38.727443    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:38.727449    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:38.727454    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:38.729651    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:38.730063    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:39.223582    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:39.223601    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.223608    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.223627    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.225990    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:39.226495    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:39.226503    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.226509    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.226514    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.228583    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:39.724043    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:39.724057    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.724068    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.724079    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.726325    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:39.726730    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:39.726738    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:39.726744    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:39.726748    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:39.728764    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:40.223977    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:40.223993    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.224000    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.224004    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.226279    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:40.226700    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:40.226708    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.226714    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.226718    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.228516    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:40.724602    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:40.724619    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.724628    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.724634    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.727418    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:40.727959    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:40.727966    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:40.727972    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:40.727983    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:40.729907    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:40.730276    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:41.223101    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:41.223117    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.223124    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.223128    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.225118    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:41.225750    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:41.225761    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.225768    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.225772    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.227757    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:41.724913    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:41.724940    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.724951    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.725035    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.728761    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:41.729240    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:41.729247    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:41.729252    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:41.729255    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:41.730912    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:42.224964    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:42.224989    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.225001    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.225006    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.228620    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:42.229196    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:42.229204    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.229210    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.229214    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.232307    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:42.725079    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:42.725106    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.725118    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.725128    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.728799    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:42.729409    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:42.729420    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:42.729429    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:42.729435    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:42.731172    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:42.731531    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:43.225019    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:43.225047    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.225060    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.225067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.228808    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:43.229389    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:43.229399    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.229405    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.229409    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.231056    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:43.724985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:43.725000    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.725006    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.725010    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.727056    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:43.727478    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:43.727485    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:43.727491    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:43.727494    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:43.729068    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:44.224095    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:44.224121    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.224133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.224181    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.227349    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:44.228120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:44.228128    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.228134    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.228138    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.229966    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:44.725021    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:44.725045    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.725058    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.725062    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.729238    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:44.729727    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:44.729735    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:44.729741    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:44.729745    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:44.731433    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:44.731726    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:45.225302    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:45.225330    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.225341    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.225347    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.228863    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:45.229379    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:45.229389    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.229397    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.229401    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.231429    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:45.724243    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:45.724324    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.724337    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.724344    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.727683    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:45.728405    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:45.728413    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:45.728419    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:45.728422    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:45.730098    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:46.223716    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:46.223773    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.223788    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.223796    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.227605    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:46.228067    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:46.228076    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.228082    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.228085    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.229768    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:46.724565    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:46.724619    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.724633    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.724641    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.728150    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:46.728985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:46.728992    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:46.728998    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:46.729001    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:46.730855    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:47.224578    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:47.224599    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.224612    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.224618    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.227578    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:47.228002    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:47.228009    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.228015    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.228018    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.229721    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:47.230041    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:47.724560    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:47.724585    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.724594    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.724599    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.728122    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:47.728734    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:47.728742    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:47.728748    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:47.728751    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:47.730435    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:48.223615    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:48.223629    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.223636    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.223640    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.226095    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:48.226577    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:48.226586    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.226591    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.226598    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.228415    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:48.724122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:48.724142    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.724153    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.724160    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.727651    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:48.728172    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:48.728183    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:48.728191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:48.728195    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:48.729902    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:49.223260    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:49.223281    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.223292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.223298    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.226301    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:49.226932    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:49.226940    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.226945    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.226947    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.228480    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:49.724076    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:49.724109    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.724120    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.724127    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.727544    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:49.728275    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:49.728284    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:49.728290    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:49.728293    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:49.729994    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:49.730332    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:50.223448    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:50.223462    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.223471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.223475    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.225685    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:50.226217    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:50.226225    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.226231    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.226242    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.228286    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:50.723871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:50.723896    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.723910    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.723918    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.727053    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:50.728013    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:50.728021    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:50.728027    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:50.728033    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:50.729924    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:51.223394    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:51.223411    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.223419    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.223424    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.226019    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:51.226638    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:51.226646    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.226652    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.226662    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.228242    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:51.724305    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:51.724331    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.724341    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.724348    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.728121    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:51.728579    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:51.728588    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:51.728593    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:51.728603    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:51.730578    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:51.730868    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:52.223952    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:52.224012    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.224021    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.224025    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.226458    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:52.227072    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:52.227080    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.227087    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.227090    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.228719    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:52.724240    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:52.724287    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.724299    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.724308    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.727394    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:52.727827    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:52.727834    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:52.727840    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:52.727844    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:52.729417    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:53.224920    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:53.225020    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.225037    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.225045    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.228826    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:53.229364    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:53.229374    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.229380    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.229387    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.231081    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:53.723365    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:53.723381    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.723393    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.723397    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.725512    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:53.725934    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:53.725942    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:53.725948    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:53.725951    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:53.727517    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.223251    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:54.223290    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.223310    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.223318    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.225362    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.225778    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.225786    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.225792    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.225797    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.227316    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.227664    3744 pod_ready.go:103] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"False"
	I0831 15:37:54.723470    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:37:54.723553    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.723566    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.723572    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.726339    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.727040    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.727047    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.727053    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.727056    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.729195    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.729717    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.729726    3744 pod_ready.go:82] duration metric: took 18.006599646s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.729733    3744 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.729768    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:37:54.729773    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.729779    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.729782    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.731747    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.732348    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.732355    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.732364    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.732369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.734207    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.734716    3744 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.734725    3744 pod_ready.go:82] duration metric: took 4.986587ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.734738    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.734775    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:37:54.734780    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.734785    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.734789    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.736900    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.737556    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:54.737563    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.737569    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.737573    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.739693    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.740047    3744 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.740059    3744 pod_ready.go:82] duration metric: took 5.312281ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.740065    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.740098    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:37:54.740102    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.740108    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.740113    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.742355    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.742925    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:54.742933    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.742939    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.742944    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.744985    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.745483    3744 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.745493    3744 pod_ready.go:82] duration metric: took 5.421796ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.745499    3744 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.745536    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:37:54.745541    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.745547    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.745550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.747563    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:54.748056    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:54.748063    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.748069    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.748071    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.749754    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:54.750027    3744 pod_ready.go:93] pod "etcd-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:54.750036    3744 pod_ready.go:82] duration metric: took 4.531272ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.750045    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:54.924527    3744 request.go:632] Waited for 174.448251ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:37:54.924561    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:37:54.924565    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:54.924570    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:54.924576    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:54.926540    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:55.124217    3744 request.go:632] Waited for 197.191409ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:55.124320    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:55.124331    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.124342    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.124349    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.127699    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:55.127979    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:55.127988    3744 pod_ready.go:82] duration metric: took 377.933462ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.127995    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.323995    3744 request.go:632] Waited for 195.947787ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:37:55.324122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:37:55.324133    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.324142    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.324147    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.326536    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:55.524340    3744 request.go:632] Waited for 197.377407ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:55.524428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:55.524437    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.524444    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.524458    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.527694    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:55.528065    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:55.528075    3744 pod_ready.go:82] duration metric: took 400.071053ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.528082    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.724069    3744 request.go:632] Waited for 195.89026ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:37:55.724147    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:37:55.724160    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.724178    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.724193    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.727264    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:55.924174    3744 request.go:632] Waited for 196.444661ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:55.924262    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:55.924273    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:55.924284    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:55.924290    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:55.927217    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:55.927667    3744 pod_ready.go:93] pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:55.927677    3744 pod_ready.go:82] duration metric: took 399.585518ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:55.927691    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.123773    3744 request.go:632] Waited for 195.997614ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:37:56.123824    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:37:56.123834    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.123859    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.123868    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.126826    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:56.323602    3744 request.go:632] Waited for 196.242245ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:56.323669    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:56.323713    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.323725    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.323736    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.326205    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:56.326487    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:56.326497    3744 pod_ready.go:82] duration metric: took 398.79568ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.326504    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.525262    3744 request.go:632] Waited for 198.697997ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:56.525404    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:37:56.525415    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.525426    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.525435    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.528812    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:56.723576    3744 request.go:632] Waited for 194.289214ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:56.723635    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:56.723642    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.723648    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.723664    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.725655    3744 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:37:56.726101    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:56.726110    3744 pod_ready.go:82] duration metric: took 399.596067ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.726117    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:56.923811    3744 request.go:632] Waited for 197.624636ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:37:56.923859    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:37:56.923866    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:56.923874    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:56.923879    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:56.926307    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.123874    3744 request.go:632] Waited for 197.165319ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.123948    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.123956    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.123964    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.123981    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.126673    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.127130    3744 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:57.127139    3744 pod_ready.go:82] duration metric: took 401.01276ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.127146    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.323575    3744 request.go:632] Waited for 196.38297ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:37:57.323627    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:37:57.323635    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.323646    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.323654    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.326792    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:57.524981    3744 request.go:632] Waited for 197.675488ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:57.525056    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:57.525064    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.525072    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.525077    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.527436    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.527834    3744 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:57.527844    3744 pod_ready.go:82] duration metric: took 400.687607ms for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.527851    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.724761    3744 request.go:632] Waited for 196.867729ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:37:57.724843    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:37:57.724852    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.724860    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.724864    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.727338    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.924277    3744 request.go:632] Waited for 196.366483ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.924352    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:57.924361    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:57.924369    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:57.924376    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:57.926744    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:57.927036    3744 pod_ready.go:93] pod "kube-proxy-d45q5" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:57.927045    3744 pod_ready.go:82] duration metric: took 399.185058ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:57.927052    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.123932    3744 request.go:632] Waited for 196.831846ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:58.124040    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:37:58.124050    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.124062    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.124067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.127075    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:58.323899    3744 request.go:632] Waited for 196.438465ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.323934    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.323939    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.323946    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.323982    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.326076    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:58.326347    3744 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:58.326358    3744 pod_ready.go:82] duration metric: took 399.29367ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.326365    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.524333    3744 request.go:632] Waited for 197.864558ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:37:58.524448    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:37:58.524460    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.524471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.524478    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.527937    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:58.724668    3744 request.go:632] Waited for 196.043209ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.724763    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:37:58.724780    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.724797    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.724815    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.727732    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:58.728090    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:58.728099    3744 pod_ready.go:82] duration metric: took 401.725065ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.728105    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:58.925170    3744 request.go:632] Waited for 197.0037ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:37:58.925325    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:37:58.925339    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:58.925351    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:58.925358    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:58.928967    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:59.124043    3744 request.go:632] Waited for 194.666869ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:59.124133    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:37:59.124143    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.124154    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.124161    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.127137    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:59.127523    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:59.127532    3744 pod_ready.go:82] duration metric: took 399.417767ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:59.127541    3744 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:59.324020    3744 request.go:632] Waited for 196.418346ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:59.324169    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:37:59.324180    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.324191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.324200    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.327657    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:59.523961    3744 request.go:632] Waited for 195.650623ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:59.524073    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:37:59.524086    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.524097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.524105    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.527091    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:37:59.527542    3744 pod_ready.go:93] pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace has status "Ready":"True"
	I0831 15:37:59.527550    3744 pod_ready.go:82] duration metric: took 399.999976ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:37:59.527558    3744 pod_ready.go:39] duration metric: took 22.814715363s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:37:59.527569    3744 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:37:59.527620    3744 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:37:59.540037    3744 api_server.go:72] duration metric: took 23.068203242s to wait for apiserver process to appear ...
	I0831 15:37:59.540049    3744 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:37:59.540059    3744 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:37:59.543113    3744 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:37:59.543146    3744 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:37:59.543150    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.543156    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.543161    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.543866    3744 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:37:59.543927    3744 api_server.go:141] control plane version: v1.31.0
	I0831 15:37:59.543936    3744 api_server.go:131] duration metric: took 3.882759ms to wait for apiserver health ...
	I0831 15:37:59.543942    3744 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:37:59.723587    3744 request.go:632] Waited for 179.596374ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:59.723694    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:37:59.723708    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.723718    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.723734    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.728359    3744 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:37:59.733656    3744 system_pods.go:59] 24 kube-system pods found
	I0831 15:37:59.733668    3744 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:37:59.733672    3744 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:37:59.733676    3744 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:37:59.733679    3744 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:37:59.733681    3744 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:37:59.733684    3744 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:37:59.733686    3744 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:37:59.733689    3744 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:37:59.733691    3744 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:37:59.733694    3744 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:37:59.733696    3744 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:37:59.733699    3744 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:37:59.733702    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:37:59.733705    3744 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:37:59.733708    3744 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:37:59.733710    3744 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:37:59.733714    3744 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:37:59.733718    3744 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:37:59.733721    3744 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:37:59.733724    3744 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:37:59.733726    3744 system_pods.go:61] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:37:59.733729    3744 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:37:59.733731    3744 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:37:59.733734    3744 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:37:59.733738    3744 system_pods.go:74] duration metric: took 189.789494ms to wait for pod list to return data ...
	I0831 15:37:59.733743    3744 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:37:59.923784    3744 request.go:632] Waited for 189.987121ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:59.923870    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:37:59.923881    3744 round_trippers.go:469] Request Headers:
	I0831 15:37:59.923893    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:37:59.923900    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:37:59.927288    3744 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:37:59.927352    3744 default_sa.go:45] found service account: "default"
	I0831 15:37:59.927361    3744 default_sa.go:55] duration metric: took 193.611323ms for default service account to be created ...
	I0831 15:37:59.927366    3744 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:38:00.124803    3744 request.go:632] Waited for 197.388029ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:38:00.124898    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:38:00.124909    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:00.124920    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:00.124939    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:00.129956    3744 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:38:00.134973    3744 system_pods.go:86] 24 kube-system pods found
	I0831 15:38:00.134985    3744 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:38:00.134989    3744 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:38:00.134993    3744 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:38:00.134996    3744 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:38:00.134999    3744 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:38:00.135002    3744 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:38:00.135005    3744 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:38:00.135008    3744 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:38:00.135011    3744 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:38:00.135013    3744 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:38:00.135017    3744 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:38:00.135019    3744 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:38:00.135025    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:38:00.135028    3744 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:38:00.135031    3744 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:38:00.135034    3744 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:38:00.135037    3744 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:38:00.135039    3744 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:38:00.135042    3744 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:38:00.135045    3744 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:38:00.135049    3744 system_pods.go:89] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:38:00.135051    3744 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:38:00.135056    3744 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:38:00.135060    3744 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:38:00.135065    3744 system_pods.go:126] duration metric: took 207.692433ms to wait for k8s-apps to be running ...
	I0831 15:38:00.135070    3744 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:38:00.135137    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:38:00.146618    3744 system_svc.go:56] duration metric: took 11.54297ms WaitForService to wait for kubelet
	I0831 15:38:00.146633    3744 kubeadm.go:582] duration metric: took 23.674794454s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:38:00.146650    3744 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:38:00.324468    3744 request.go:632] Waited for 177.772827ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:38:00.324541    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:38:00.324549    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:00.324557    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:00.324561    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:00.326804    3744 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:38:00.327655    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:38:00.327666    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:38:00.327673    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:38:00.327677    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:38:00.327680    3744 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:38:00.327683    3744 node_conditions.go:123] node cpu capacity is 2
	I0831 15:38:00.327689    3744 node_conditions.go:105] duration metric: took 181.029342ms to run NodePressure ...
	I0831 15:38:00.327697    3744 start.go:241] waiting for startup goroutines ...
	I0831 15:38:00.327709    3744 start.go:255] writing updated cluster config ...
	I0831 15:38:00.348472    3744 out.go:201] 
	I0831 15:38:00.369311    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:00.369379    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:38:00.391565    3744 out.go:177] * Starting "ha-949000-m04" worker node in "ha-949000" cluster
	I0831 15:38:00.433358    3744 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:38:00.433417    3744 cache.go:56] Caching tarball of preloaded images
	I0831 15:38:00.433601    3744 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:38:00.433620    3744 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:38:00.433752    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:38:00.434936    3744 start.go:360] acquireMachinesLock for ha-949000-m04: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:38:00.435036    3744 start.go:364] duration metric: took 76.344µs to acquireMachinesLock for "ha-949000-m04"
	I0831 15:38:00.435061    3744 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:38:00.435070    3744 fix.go:54] fixHost starting: m04
	I0831 15:38:00.435494    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:38:00.435519    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:38:00.444781    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51881
	I0831 15:38:00.445158    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:38:00.445521    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:38:00.445531    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:38:00.445763    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:38:00.445892    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:00.445989    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:38:00.446076    3744 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:00.446156    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3377
	I0831 15:38:00.447072    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid 3377 missing from process table
	I0831 15:38:00.447102    3744 fix.go:112] recreateIfNeeded on ha-949000-m04: state=Stopped err=<nil>
	I0831 15:38:00.447112    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	W0831 15:38:00.447197    3744 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:38:00.468433    3744 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m04" ...
	I0831 15:38:00.542198    3744 main.go:141] libmachine: (ha-949000-m04) Calling .Start
	I0831 15:38:00.542515    3744 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:00.542650    3744 main.go:141] libmachine: (ha-949000-m04) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid
	I0831 15:38:00.544312    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid 3377 missing from process table
	I0831 15:38:00.544344    3744 main.go:141] libmachine: (ha-949000-m04) DBG | pid 3377 is in state "Stopped"
	I0831 15:38:00.544372    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid...
	I0831 15:38:00.544580    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Using UUID 5ee34770-2239-4427-9789-bd204fe095a6
	I0831 15:38:00.571913    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Generated MAC 8a:3c:61:5f:c5:84
	I0831 15:38:00.571940    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:38:00.572058    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bec00)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:38:00.572092    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bec00)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:38:00.572124    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "5ee34770-2239-4427-9789-bd204fe095a6", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:38:00.572235    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 5ee34770-2239-4427-9789-bd204fe095a6 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:38:00.572259    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:38:00.573709    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 DEBUG: hyperkit: Pid is 3806
	I0831 15:38:00.574064    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 0
	I0831 15:38:00.574112    3744 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:00.574129    3744 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3806
	I0831 15:38:00.576177    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:38:00.576262    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:38:00.576305    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d4eca7}
	I0831 15:38:00.576319    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:38:00.576335    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:38:00.576351    3744 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d4eb85}
	I0831 15:38:00.576382    3744 main.go:141] libmachine: (ha-949000-m04) DBG | Found match: 8a:3c:61:5f:c5:84
	I0831 15:38:00.576399    3744 main.go:141] libmachine: (ha-949000-m04) DBG | IP: 192.169.0.8
	I0831 15:38:00.576410    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetConfigRaw
	I0831 15:38:00.577215    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:00.577389    3744 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:38:00.577864    3744 machine.go:93] provisionDockerMachine start ...
	I0831 15:38:00.577878    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:00.578009    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:00.578108    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:00.578212    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:00.578342    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:00.578431    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:00.578558    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:00.578712    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:00.578720    3744 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:38:00.582294    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:38:00.590710    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:38:00.591705    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:38:00.591723    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:38:00.591734    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:38:00.591743    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:38:00.976655    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:38:00.976695    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:00 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:38:01.091423    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:38:01.091445    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:38:01.091527    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:38:01.091554    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:38:01.092272    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:38:01.092283    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:01 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:38:06.721349    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:38:06.721473    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:38:06.721482    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:38:06.745779    3744 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:38:06 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:38:11.647284    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:38:11.647298    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:38:11.647457    3744 buildroot.go:166] provisioning hostname "ha-949000-m04"
	I0831 15:38:11.647468    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:38:11.647566    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.647657    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:11.647737    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.647830    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.647929    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:11.648056    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:11.648211    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:11.648224    3744 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m04 && echo "ha-949000-m04" | sudo tee /etc/hostname
	I0831 15:38:11.720881    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m04
	
	I0831 15:38:11.720895    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.721030    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:11.721141    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.721229    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.721323    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:11.721445    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:11.721583    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:11.721594    3744 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:38:11.787551    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:38:11.787565    3744 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:38:11.787574    3744 buildroot.go:174] setting up certificates
	I0831 15:38:11.787580    3744 provision.go:84] configureAuth start
	I0831 15:38:11.787586    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:38:11.787717    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:11.787807    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.787897    3744 provision.go:143] copyHostCerts
	I0831 15:38:11.787923    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:38:11.787987    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:38:11.787993    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:38:11.788130    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:38:11.788325    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:38:11.788370    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:38:11.788375    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:38:11.788450    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:38:11.788631    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:38:11.788686    3744 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:38:11.788692    3744 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:38:11.788777    3744 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:38:11.788936    3744 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m04 san=[127.0.0.1 192.169.0.8 ha-949000-m04 localhost minikube]
	I0831 15:38:11.923616    3744 provision.go:177] copyRemoteCerts
	I0831 15:38:11.923670    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:38:11.923684    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:11.923822    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:11.923908    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:11.924002    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:11.924089    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:11.965052    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:38:11.965128    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:38:11.989075    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:38:11.989152    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:38:12.008938    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:38:12.009008    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:38:12.028923    3744 provision.go:87] duration metric: took 241.333371ms to configureAuth
	I0831 15:38:12.028939    3744 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:38:12.029131    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:12.029146    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:12.029282    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:12.029361    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:12.029448    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.029527    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.029620    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:12.029746    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:12.029867    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:12.029874    3744 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:38:12.090450    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:38:12.090463    3744 buildroot.go:70] root file system type: tmpfs
	I0831 15:38:12.090535    3744 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:38:12.090548    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:12.090681    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:12.090786    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.090898    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.091016    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:12.091186    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:12.091325    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:12.091371    3744 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:38:12.161741    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	Environment=NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:38:12.161767    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:12.161902    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:12.161995    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.162101    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:12.162204    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:12.162325    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:12.162467    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:12.162479    3744 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:38:13.717080    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:38:13.717094    3744 machine.go:96] duration metric: took 13.139081069s to provisionDockerMachine
	I0831 15:38:13.717101    3744 start.go:293] postStartSetup for "ha-949000-m04" (driver="hyperkit")
	I0831 15:38:13.717109    3744 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:38:13.717123    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.717308    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:38:13.717321    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.717411    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.717514    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.717598    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.717686    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:13.753970    3744 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:38:13.757041    3744 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:38:13.757049    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:38:13.757147    3744 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:38:13.757317    3744 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:38:13.757323    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:38:13.757520    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:38:13.764743    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:38:13.784545    3744 start.go:296] duration metric: took 67.430377ms for postStartSetup
	I0831 15:38:13.784594    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.784782    3744 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:38:13.784795    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.784891    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.784980    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.785074    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.785157    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:13.822419    3744 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:38:13.822478    3744 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:38:13.856251    3744 fix.go:56] duration metric: took 13.421034183s for fixHost
	I0831 15:38:13.856276    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.856412    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.856504    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.856591    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.856670    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.856794    3744 main.go:141] libmachine: Using SSH client type: native
	I0831 15:38:13.856933    3744 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xe5a7ea0] 0xe5aac00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:38:13.856940    3744 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:38:13.917606    3744 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725143893.981325007
	
	I0831 15:38:13.917619    3744 fix.go:216] guest clock: 1725143893.981325007
	I0831 15:38:13.917634    3744 fix.go:229] Guest: 2024-08-31 15:38:13.981325007 -0700 PDT Remote: 2024-08-31 15:38:13.856265 -0700 PDT m=+124.128653576 (delta=125.060007ms)
	I0831 15:38:13.917650    3744 fix.go:200] guest clock delta is within tolerance: 125.060007ms
	I0831 15:38:13.917655    3744 start.go:83] releasing machines lock for "ha-949000-m04", held for 13.482464465s
	I0831 15:38:13.917676    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.917802    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:13.942019    3744 out.go:177] * Found network options:
	I0831 15:38:13.963076    3744 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	W0831 15:38:13.984049    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:38:13.984067    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:38:13.984075    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:38:13.984086    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.984514    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.984633    3744 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:38:13.984692    3744 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:38:13.984722    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	W0831 15:38:13.984773    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:38:13.984786    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:38:13.984810    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	W0831 15:38:13.984809    3744 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:38:13.984873    3744 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:38:13.984894    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:38:13.984907    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.984995    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:38:13.985009    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.985085    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:38:13.985105    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:38:13.985186    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:38:13.985271    3744 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	W0831 15:38:14.024342    3744 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:38:14.024407    3744 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:38:14.067158    3744 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:38:14.067173    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:38:14.067244    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:38:14.082520    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:38:14.090779    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:38:14.099040    3744 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:38:14.099091    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:38:14.107242    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:38:14.115660    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:38:14.124011    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:38:14.132309    3744 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:38:14.140696    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:38:14.149089    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:38:14.157409    3744 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:38:14.165662    3744 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:38:14.173102    3744 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:38:14.180728    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:14.276483    3744 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:38:14.296705    3744 start.go:495] detecting cgroup driver to use...
	I0831 15:38:14.296785    3744 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:38:14.312751    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:38:14.325397    3744 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:38:14.342774    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:38:14.353024    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:38:14.363251    3744 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:38:14.380028    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:38:14.390424    3744 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:38:14.405244    3744 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:38:14.408231    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:38:14.415934    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:38:14.429648    3744 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:38:14.529094    3744 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:38:14.646662    3744 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:38:14.646690    3744 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:38:14.660870    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:14.760474    3744 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:38:17.038586    3744 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.278065529s)
	I0831 15:38:17.038650    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:38:17.049008    3744 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:38:17.062620    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:38:17.073607    3744 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:38:17.168850    3744 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:38:17.269764    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:17.377489    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:38:17.390666    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:38:17.402072    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:17.507294    3744 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:38:17.568987    3744 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:38:17.569066    3744 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:38:17.574853    3744 start.go:563] Will wait 60s for crictl version
	I0831 15:38:17.574909    3744 ssh_runner.go:195] Run: which crictl
	I0831 15:38:17.578814    3744 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:38:17.605368    3744 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:38:17.605446    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:38:17.624343    3744 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:38:17.679051    3744 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:38:17.753456    3744 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:38:17.812386    3744 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:38:17.902651    3744 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	I0831 15:38:17.924439    3744 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:38:17.924700    3744 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:38:17.928251    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:38:17.938446    3744 mustload.go:65] Loading cluster: ha-949000
	I0831 15:38:17.938620    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:17.938850    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:38:17.938873    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:38:17.947622    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51903
	I0831 15:38:17.948032    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:38:17.948446    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:38:17.948460    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:38:17.948674    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:38:17.948791    3744 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:38:17.948881    3744 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:38:17.948987    3744 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:38:17.950000    3744 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:38:17.950260    3744 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:38:17.950294    3744 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:38:17.959428    3744 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51905
	I0831 15:38:17.959777    3744 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:38:17.960145    3744 main.go:141] libmachine: Using API Version  1
	I0831 15:38:17.960162    3744 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:38:17.960360    3744 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:38:17.960471    3744 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:38:17.960562    3744 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.8
	I0831 15:38:17.960568    3744 certs.go:194] generating shared ca certs ...
	I0831 15:38:17.960576    3744 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:38:17.960771    3744 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:38:17.960844    3744 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:38:17.960854    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:38:17.960878    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:38:17.960897    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:38:17.960914    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:38:17.961001    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:38:17.961051    3744 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:38:17.961060    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:38:17.961098    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:38:17.961130    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:38:17.961166    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:38:17.961235    3744 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:38:17.961269    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:38:17.961290    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:17.961312    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:38:17.961342    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:38:17.980971    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:38:18.000269    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:38:18.019936    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:38:18.039774    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:38:18.059357    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:38:18.078502    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:38:18.097967    3744 ssh_runner.go:195] Run: openssl version
	I0831 15:38:18.102444    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:38:18.111969    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:38:18.115584    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:38:18.115639    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:38:18.119889    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:38:18.129130    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:38:18.138067    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:18.141420    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:18.141464    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:38:18.145592    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:38:18.154725    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:38:18.163859    3744 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:38:18.167695    3744 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:38:18.167749    3744 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:38:18.172178    3744 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:38:18.181412    3744 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:38:18.184441    3744 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:38:18.184478    3744 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.31.0  false true} ...
	I0831 15:38:18.184543    3744 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:38:18.184588    3744 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:38:18.192672    3744 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.31.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.31.0': No such file or directory
	
	Initiating transfer...
	I0831 15:38:18.192722    3744 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.31.0
	I0831 15:38:18.201203    3744 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubeadm.sha256
	I0831 15:38:18.201203    3744 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubectl.sha256
	I0831 15:38:18.201205    3744 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.31.0/bin/linux/amd64/kubelet.sha256
	I0831 15:38:18.201219    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm -> /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:38:18.201219    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl -> /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:38:18.201260    3744 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:38:18.201327    3744 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm
	I0831 15:38:18.201327    3744 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl
	I0831 15:38:18.213304    3744 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubeadm': No such file or directory
	I0831 15:38:18.213305    3744 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubectl': No such file or directory
	I0831 15:38:18.213306    3744 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet -> /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:38:18.213339    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubectl --> /var/lib/minikube/binaries/v1.31.0/kubectl (56381592 bytes)
	I0831 15:38:18.213339    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubeadm --> /var/lib/minikube/binaries/v1.31.0/kubeadm (58290328 bytes)
	I0831 15:38:18.213434    3744 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet
	I0831 15:38:18.234959    3744 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.31.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.31.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.31.0/kubelet': No such file or directory
	I0831 15:38:18.235000    3744 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/cache/linux/amd64/v1.31.0/kubelet --> /var/lib/minikube/binaries/v1.31.0/kubelet (76865848 bytes)
	I0831 15:38:18.870025    3744 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0831 15:38:18.878175    3744 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:38:18.892204    3744 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:38:18.906289    3744 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:38:18.909279    3744 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:38:18.919652    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:19.014285    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:38:19.030068    3744 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}
	I0831 15:38:19.030257    3744 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:38:19.052807    3744 out.go:177] * Verifying Kubernetes components...
	I0831 15:38:19.073469    3744 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:38:19.170855    3744 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:38:19.775316    3744 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:38:19.775538    3744 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0xfc63c00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:38:19.775580    3744 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:38:19.775737    3744 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:38:19.775777    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:19.775783    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:19.775789    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:19.775793    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:19.778097    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:20.276562    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:20.276584    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:20.276613    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:20.276621    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:20.279146    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:20.777079    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:20.777090    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:20.777097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:20.777101    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:20.779128    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:21.277261    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:21.277277    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:21.277283    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:21.277286    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:21.279452    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:21.776272    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:21.776285    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:21.776292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:21.776295    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:21.778482    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:21.778547    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:22.276209    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:22.276224    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:22.276233    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:22.276239    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:22.278431    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:22.775916    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:22.775932    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:22.775939    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:22.775943    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:22.778178    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:23.276360    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:23.276381    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:23.276392    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:23.276398    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:23.279406    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:23.775977    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:23.775995    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:23.776032    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:23.776037    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:23.778193    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:24.277072    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:24.277087    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:24.277093    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:24.277097    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:24.279300    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:24.279384    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:24.777071    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:24.777083    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:24.777089    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:24.777093    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:24.779084    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:25.277302    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:25.277326    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:25.277343    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:25.277370    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:25.280739    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:25.777360    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:25.777375    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:25.777382    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:25.777386    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:25.779584    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:26.277703    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:26.277720    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:26.277728    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:26.277739    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:26.279789    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:26.279858    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:26.776231    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:26.776272    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:26.776280    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:26.776285    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:26.778315    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:27.276174    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:27.276188    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:27.276194    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:27.276197    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:27.278437    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:27.776689    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:27.776708    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:27.776717    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:27.776721    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:27.779053    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:28.276081    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:28.276100    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:28.276111    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:28.276117    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:28.279235    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:28.776709    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:28.776722    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:28.776728    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:28.776732    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:28.778876    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:28.778948    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:29.276276    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:29.276292    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:29.276300    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:29.276306    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:29.278917    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:29.776120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:29.776137    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:29.776147    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:29.776153    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:29.778926    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:30.277099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:30.277114    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:30.277119    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:30.277121    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:30.279209    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:30.776289    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:30.776306    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:30.776318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:30.776325    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:30.778950    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:30.779042    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:31.277113    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:31.277129    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:31.277137    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:31.277142    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:31.279308    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:31.776871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:31.776885    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:31.776892    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:31.776907    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:31.779110    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:32.276639    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:32.276666    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:32.276677    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:32.276709    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:32.279642    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:32.776964    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:32.777005    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:32.777013    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:32.777017    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:32.778916    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:33.276097    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:33.276113    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:33.276120    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:33.276123    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:33.278201    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:33.278323    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:33.778025    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:33.778051    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:33.778062    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:33.778067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:33.781122    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:34.277596    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:34.277611    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:34.277617    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:34.277620    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:34.279507    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:34.776042    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:34.776055    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:34.776061    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:34.776064    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:34.778134    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:35.276180    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:35.276203    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:35.276281    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:35.276292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:35.279248    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:35.279324    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:35.776557    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:35.776577    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:35.776588    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:35.776595    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:35.779906    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:36.276525    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:36.276541    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:36.276547    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:36.276550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:36.278734    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:36.776417    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:36.776489    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:36.776512    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:36.776522    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:36.779524    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:37.277720    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:37.277733    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:37.277739    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:37.277743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:37.279925    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:37.279984    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:37.777252    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:37.777269    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:37.777274    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:37.777277    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:37.779271    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:38.278156    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:38.278211    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:38.278223    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:38.278229    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:38.280712    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:38.776178    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:38.776203    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:38.776213    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:38.776219    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:38.779093    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:39.276872    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:39.276885    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:39.276892    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:39.276896    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:39.279063    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:39.776884    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:39.776898    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:39.776905    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:39.776909    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:39.779259    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:39.779362    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:40.277202    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:40.277230    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:40.277242    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:40.277249    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:40.280416    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:40.776384    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:40.776396    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:40.776403    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:40.776406    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:40.778591    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:41.276444    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:41.276465    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:41.276477    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:41.276482    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:41.279236    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:41.777547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:41.777633    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:41.777648    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:41.777658    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:41.780834    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:41.780914    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:42.276626    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:42.276639    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:42.276646    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:42.276649    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:42.278771    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:42.777502    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:42.777527    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:42.777539    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:42.777544    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:42.780668    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:43.277508    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:43.277532    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:43.277544    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:43.277551    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:43.281198    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:43.777290    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:43.777306    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:43.777313    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:43.777316    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:43.779556    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:44.277098    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:44.277133    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:44.277144    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:44.277150    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:44.280482    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:44.280570    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:44.776182    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:44.776196    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:44.776204    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:44.776210    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:44.778630    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:45.276509    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:45.276522    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:45.276528    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:45.276540    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:45.278778    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:45.776791    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:45.776866    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:45.776879    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:45.776888    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:45.779812    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:46.277629    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:46.277650    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:46.277661    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:46.277669    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:46.280694    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:46.280771    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:46.776617    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:46.776632    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:46.776639    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:46.776644    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:46.778705    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:47.276853    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:47.276872    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:47.276881    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:47.276886    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:47.279224    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:47.777691    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:47.777716    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:47.777764    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:47.777772    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:47.780764    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:48.276263    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:48.276280    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:48.276286    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:48.276289    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:48.278387    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:48.776798    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:48.776866    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:48.776876    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:48.776880    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:48.779266    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:48.779328    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:49.277706    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:49.277731    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:49.277798    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:49.277809    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:49.280441    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:49.776295    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:49.776306    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:49.776312    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:49.776320    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:49.778554    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:50.278315    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:50.278338    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:50.278403    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:50.278414    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:50.281533    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:50.777763    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:50.777778    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:50.777787    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:50.777796    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:50.780173    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:50.780239    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:51.276316    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:51.276332    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:51.276338    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:51.276342    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:51.278631    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:51.776296    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:51.776316    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:51.776325    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:51.776330    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:51.778726    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:52.276790    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:52.276847    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:52.276864    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:52.276870    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:52.279948    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:52.777099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:52.777115    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:52.777121    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:52.777126    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:52.779325    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:53.276819    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:53.276881    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:53.276895    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:53.276904    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:53.279807    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:53.279883    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:53.776517    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:53.776532    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:53.776539    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:53.776543    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:53.778686    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:54.276276    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:54.276289    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:54.276299    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:54.276302    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:54.278627    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:54.777871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:54.777890    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:54.777900    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:54.777906    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:54.781132    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:55.276882    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:55.276901    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:55.276913    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:55.276919    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:55.280226    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:55.280299    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:55.777001    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:55.777014    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:55.777020    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:55.777023    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:55.779025    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:56.277691    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:56.277714    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:56.277726    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:56.277731    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:56.280819    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:56.778188    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:56.778247    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:56.778257    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:56.778262    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:56.780685    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:57.276330    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:57.276344    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:57.276350    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:57.276354    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:57.278527    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:57.776849    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:57.776867    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:57.776875    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:57.776880    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:57.779132    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:57.779222    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:38:58.276676    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:58.276715    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:58.276723    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:58.276727    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:58.278722    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:38:58.776823    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:58.776841    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:58.776847    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:58.776851    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:58.779004    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:38:59.277009    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:59.277030    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:59.277041    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:59.277049    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:59.280147    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:59.776972    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:38:59.776990    3744 round_trippers.go:469] Request Headers:
	I0831 15:38:59.776999    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:38:59.777007    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:38:59.780392    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:38:59.780554    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:00.278237    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:00.278268    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:00.278275    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:00.278279    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:00.280339    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:00.776782    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:00.776803    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:00.776814    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:00.776819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:00.780040    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:01.276687    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:01.276709    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:01.276717    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:01.276722    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:01.279213    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:01.776982    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:01.776997    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:01.777004    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:01.777008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:01.779255    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:02.278179    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:02.278239    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:02.278253    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:02.278261    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:02.281537    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:02.281611    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:02.776749    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:02.776775    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:02.776786    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:02.776793    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:02.780000    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:03.278084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:03.278100    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:03.278108    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:03.278112    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:03.280525    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:03.776918    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:03.776956    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:03.777002    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:03.777009    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:03.780638    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:04.278461    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:04.278485    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:04.278497    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:04.278502    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:04.281718    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:04.281791    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:04.776376    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:04.776389    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:04.776395    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:04.776398    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:04.778509    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:05.276848    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:05.276872    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:05.276883    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:05.276889    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:05.279765    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:05.776397    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:05.776423    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:05.776433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:05.776439    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:05.779793    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:06.277954    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:06.277969    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:06.277977    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:06.277981    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:06.280446    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:06.777008    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:06.777039    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:06.777100    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:06.777109    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:06.780058    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:06.780166    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:07.277181    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:07.277203    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:07.277217    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:07.277222    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:07.280356    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:07.777895    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:07.777940    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:07.777952    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:07.777957    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:07.780087    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:08.276718    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:08.276745    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:08.276757    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:08.276763    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:08.279711    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:08.777099    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:08.777121    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:08.777132    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:08.777137    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:08.780212    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:08.780293    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:09.277158    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:09.277177    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:09.277183    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:09.277188    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:09.279328    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:09.776613    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:09.776624    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:09.776630    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:09.776635    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:09.778784    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:10.276643    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:10.276662    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:10.276674    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:10.276682    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:10.279706    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:10.776484    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:10.776495    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:10.776501    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:10.776504    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:10.778860    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:11.276917    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:11.276968    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:11.276981    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:11.276990    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:11.280015    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:11.280097    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:11.777726    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:11.777745    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:11.777753    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:11.777758    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:11.780176    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:12.278046    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:12.278058    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:12.278063    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:12.278067    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:12.280005    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:12.777919    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:12.777945    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:12.777992    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:12.777997    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:12.780507    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:13.278486    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:13.278543    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:13.278554    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:13.278559    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:13.281627    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:13.281745    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:13.776833    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:13.776854    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:13.776862    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:13.776866    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:13.779535    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:14.276922    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:14.276946    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:14.276958    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:14.276966    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:14.280174    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:14.776595    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:14.776617    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:14.776629    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:14.776634    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:14.779819    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:15.278222    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:15.278239    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:15.278247    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:15.278251    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:15.280553    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:15.776940    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:15.776965    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:15.776977    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:15.776983    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:15.780495    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:15.780576    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:16.276617    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:16.276642    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:16.276652    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:16.276656    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:16.279277    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:16.776588    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:16.776609    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:16.776618    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:16.776622    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:16.778820    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:17.277548    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:17.277568    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:17.277581    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:17.277590    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:17.280937    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:17.776562    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:17.776588    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:17.776600    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:17.776607    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:17.780085    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:18.277015    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:18.277036    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:18.277048    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:18.277056    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:18.280239    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:18.280318    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:18.777930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:18.777950    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:18.777961    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:18.777968    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:18.781102    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:19.278645    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:19.278671    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:19.278683    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:19.278689    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:19.282270    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:19.778214    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:19.778225    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:19.778230    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:19.778234    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:19.780140    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:20.277051    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:20.277079    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:20.277090    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:20.277098    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:20.280382    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:20.280543    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:20.776682    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:20.776706    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:20.776719    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:20.776724    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:20.780231    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:21.278070    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:21.278085    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:21.278092    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:21.278096    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:21.280488    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:21.776700    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:21.776723    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:21.776735    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:21.776743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:21.779589    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:22.276945    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:22.276985    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:22.276996    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:22.277001    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:22.279378    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:22.777198    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:22.777217    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:22.777226    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:22.777230    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:22.779837    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:22.779899    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:23.277517    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:23.277532    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:23.277540    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:23.277546    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:23.280021    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:23.776652    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:23.776672    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:23.776680    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:23.776685    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:23.779129    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:24.277535    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:24.277618    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:24.277631    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:24.277637    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:24.280844    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:24.776736    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:24.776755    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:24.776767    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:24.776774    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:24.779817    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:25.277529    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:25.277549    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:25.277560    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:25.277564    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:25.280343    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:25.280414    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:25.777390    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:25.777407    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:25.777415    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:25.777419    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:25.779809    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:26.277450    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:26.277472    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:26.277485    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:26.277492    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:26.279869    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:26.776900    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:26.776921    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:26.776929    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:26.776934    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:26.779045    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:27.277440    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:27.277457    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:27.277463    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:27.277467    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:27.279629    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:27.776631    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:27.776647    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:27.776655    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:27.776660    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:27.779236    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:27.779329    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:28.276659    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:28.276685    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:28.276697    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:28.276704    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:28.279990    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:28.777285    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:28.777319    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:28.777326    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:28.777330    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:28.779470    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:29.276786    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:29.276806    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:29.276818    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:29.276824    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:29.279639    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:29.777308    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:29.777319    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:29.777325    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:29.777328    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:29.779377    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:29.779445    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:30.277508    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:30.277524    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:30.277530    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:30.277535    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:30.279611    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:30.778698    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:30.778722    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:30.778737    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:30.778746    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:30.781722    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:31.276851    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:31.276867    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:31.276876    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:31.276888    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:31.279490    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:31.778105    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:31.778123    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:31.778133    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:31.778137    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:31.780442    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:31.780510    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:32.278437    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:32.278459    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:32.278471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:32.278476    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:32.281165    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:32.778521    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:32.778597    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:32.778610    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:32.778618    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:32.781925    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:33.276802    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:33.276823    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:33.276832    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:33.276837    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:33.279437    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:33.777585    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:33.777608    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:33.777620    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:33.777629    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:33.780773    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:33.780858    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:34.277701    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:34.277717    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:34.277723    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:34.277726    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:34.279795    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:34.777419    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:34.777432    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:34.777438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:34.777442    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:34.779621    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:35.276815    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:35.276837    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:35.276847    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:35.276852    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:35.279717    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:35.778287    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:35.778312    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:35.778399    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:35.778409    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:35.781136    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:35.781210    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:36.276900    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:36.276915    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:36.276922    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:36.276925    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:36.279177    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:36.777030    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:36.777056    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:36.777068    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:36.777075    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:36.780399    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:37.276789    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:37.276805    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:37.276814    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:37.276819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:37.279300    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:37.777098    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:37.777112    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:37.777117    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:37.777121    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:37.779283    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:38.277802    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:38.277865    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:38.277876    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:38.277884    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:38.279839    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:38.279898    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:38.777985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:38.778008    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:38.778021    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:38.778027    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:38.781190    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:39.278167    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:39.278215    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:39.278222    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:39.278227    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:39.280014    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:39.778411    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:39.778425    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:39.778433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:39.778437    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:39.780400    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:40.276768    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:40.276779    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:40.276785    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:40.276789    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:40.278622    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:40.776752    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:40.776766    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:40.776792    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:40.776795    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:40.779016    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:40.779098    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:41.278166    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:41.278185    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:41.278202    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:41.278206    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:41.280453    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:41.776879    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:41.776941    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:41.776950    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:41.776956    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:41.779462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:42.277893    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:42.277906    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:42.277912    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:42.277916    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:42.279774    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:42.776804    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:42.776825    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:42.776836    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:42.776841    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:42.780314    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:42.780388    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:43.277438    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:43.277453    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:43.277461    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:43.277466    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:43.279958    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:43.776777    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:43.776790    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:43.776796    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:43.776799    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:43.778854    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:44.277120    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:44.277141    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:44.277152    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:44.277167    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:44.280063    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:44.777870    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:44.777891    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:44.777902    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:44.777910    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:44.780670    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:44.780806    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:45.278440    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:45.278453    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:45.278459    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:45.278464    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:45.280687    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:45.776997    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:45.777022    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:45.777033    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:45.777045    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:45.779681    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:46.277720    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:46.277761    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:46.277771    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:46.277777    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:46.279827    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:46.777445    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:46.777460    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:46.777466    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:46.777469    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:46.779643    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:47.278055    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:47.278120    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:47.278134    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:47.278141    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:47.281004    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:47.281121    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:47.778899    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:47.778923    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:47.778933    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:47.779001    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:47.781920    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:48.278094    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:48.278140    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:48.278148    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:48.278153    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:48.280253    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:48.776917    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:48.776935    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:48.776947    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:48.776954    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:48.779870    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:49.277147    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:49.277168    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:49.277179    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:49.277186    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:49.279804    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:49.778489    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:49.778501    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:49.778508    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:49.778510    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:49.780670    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:49.780731    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:50.278221    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:50.278248    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:50.278302    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:50.278312    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:50.281268    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:50.777610    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:50.777650    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:50.777663    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:50.777672    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:50.780328    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:51.276863    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:51.276878    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:51.276884    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:51.276887    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:51.278829    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:51.778792    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:51.778815    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:51.778829    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:51.778836    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:51.782105    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:51.782175    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:52.277513    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:52.277532    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:52.277544    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:52.277550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:52.280450    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:52.778374    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:52.778390    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:52.778396    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:52.778399    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:52.780416    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:53.277640    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:53.277659    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:53.277671    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:53.277677    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:53.280752    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:53.778971    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:53.779023    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:53.779036    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:53.779044    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:53.782509    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:53.782591    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:54.277469    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:54.277484    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:54.277491    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:54.277495    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:54.279585    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:54.778653    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:54.778675    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:54.778688    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:54.778708    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:54.781762    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:55.277208    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:55.277222    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:55.277263    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:55.277270    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:55.279152    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:55.777066    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:55.777102    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:55.777110    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:55.777115    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:55.779288    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:56.278230    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:56.278241    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:56.278248    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:56.278251    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:56.280389    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:56.280448    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:56.778057    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:56.778137    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:56.778151    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:56.778158    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:56.781449    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:39:57.277127    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:57.277141    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:57.277148    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:57.277151    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:57.279114    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:57.778467    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:57.778478    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:57.778485    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:57.778487    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:57.780611    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:58.277035    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:58.277048    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:58.277064    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:58.277069    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:58.284343    3744 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
	I0831 15:39:58.284416    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:39:58.778691    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:58.778707    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:58.778714    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:58.778718    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:58.780664    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:39:59.277786    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:59.277801    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:59.277810    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:59.277815    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:59.280162    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:39:59.777363    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:39:59.777389    3744 round_trippers.go:469] Request Headers:
	I0831 15:39:59.777400    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:39:59.777417    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:39:59.780437    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:00.278216    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:00.278231    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:00.278238    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:00.278241    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:00.280398    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:00.777947    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:00.777973    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:00.777985    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:00.777992    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:00.780895    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:00.780963    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:01.277061    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:01.277081    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:01.277097    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:01.277105    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:01.280071    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:01.778574    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:01.778590    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:01.778596    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:01.778599    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:01.780602    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:02.277039    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:02.277051    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:02.277057    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:02.277060    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:02.279367    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:02.777088    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:02.777113    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:02.777124    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:02.777130    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:02.780010    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:03.277129    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:03.277143    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:03.277150    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:03.277155    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:03.279360    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:03.279419    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:03.779084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:03.779111    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:03.779120    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:03.779131    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:03.782578    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:04.279084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:04.279114    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:04.279193    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:04.279209    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:04.282418    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:04.777281    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:04.777294    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:04.777300    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:04.777304    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:04.779400    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:05.277304    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:05.277357    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:05.277370    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:05.277378    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:05.280048    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:05.280120    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:05.777871    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:05.777897    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:05.777908    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:05.777914    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:05.781308    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:06.278341    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:06.278357    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:06.278365    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:06.278369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:06.280721    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:06.777260    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:06.777278    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:06.777313    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:06.777319    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:06.779441    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:07.277289    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:07.277354    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:07.277368    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:07.277376    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:07.280642    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:07.280705    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:07.777567    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:07.777583    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:07.777589    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:07.777591    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:07.779882    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:08.277957    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:08.277972    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:08.277980    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:08.277985    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:08.280280    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:08.777495    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:08.777523    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:08.777535    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:08.777541    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:08.780074    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:09.278397    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:09.278413    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:09.278419    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:09.278422    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:09.280703    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:09.280789    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:09.778365    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:09.778379    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:09.778388    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:09.778392    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:09.780762    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:10.277879    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:10.277891    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:10.277897    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:10.277900    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:10.279957    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:10.777727    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:10.777743    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:10.777749    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:10.777752    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:10.779982    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:11.277869    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:11.277892    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:11.277908    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:11.277916    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:11.281007    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:11.281122    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:11.777300    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:11.777325    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:11.777375    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:11.777385    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:11.780070    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:12.277419    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:12.277438    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:12.277444    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:12.277450    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:12.279959    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:12.778536    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:12.778559    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:12.778570    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:12.778577    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:12.782121    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:13.277460    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:13.277540    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:13.277558    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:13.277567    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:13.280462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:13.777386    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:13.777406    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:13.777417    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:13.777423    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:13.779721    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:13.779795    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:14.278352    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:14.278373    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:14.278382    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:14.278386    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:14.280995    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:14.777911    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:14.777931    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:14.777944    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:14.777953    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:14.780609    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:15.277390    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:15.277406    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:15.277413    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:15.277418    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:15.279552    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:15.777171    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:15.777196    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:15.777208    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:15.777213    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:15.780616    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:15.780690    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:16.278393    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:16.278413    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:16.278423    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:16.278431    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:16.281087    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:16.777493    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:16.777505    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:16.777511    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:16.777514    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:16.779511    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:17.277948    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:17.277963    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:17.277971    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:17.277975    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:17.281263    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:17.777610    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:17.777635    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:17.777645    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:17.777652    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:17.780711    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:17.780810    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:18.278407    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:18.278427    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:18.278438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:18.278445    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:18.280714    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:18.778225    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:18.778250    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:18.778258    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:18.778263    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:18.781566    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:19.278245    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:19.278271    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:19.278341    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:19.278351    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:19.281708    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:19.778206    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:19.778220    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:19.778226    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:19.778231    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:19.780309    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:20.277705    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:20.277724    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:20.277735    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:20.277743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:20.280797    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:20.280880    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:20.777518    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:20.777542    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:20.777554    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:20.777562    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:20.780637    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:21.277649    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:21.277665    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:21.277671    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:21.277675    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:21.280074    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:21.778048    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:21.778072    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:21.778084    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:21.778090    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:21.781448    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:22.277500    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:22.277519    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:22.277530    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:22.277535    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:22.280641    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:22.778428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:22.778446    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:22.778455    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:22.778461    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:22.780933    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:22.780991    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:23.277541    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:23.277605    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:23.277620    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:23.277627    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:23.280957    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:23.777433    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:23.777447    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:23.777454    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:23.777457    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:23.779506    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:24.277362    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:24.277385    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:24.277433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:24.277440    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:24.280068    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:24.778081    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:24.778099    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:24.778111    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:24.778117    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:24.781146    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:24.781249    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:25.278144    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:25.278167    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:25.278178    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:25.278185    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:25.281087    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:25.778478    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:25.778499    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:25.778540    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:25.778545    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:25.780863    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:26.277292    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:26.277320    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:26.277335    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:26.277342    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:26.280115    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:26.777557    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:26.777573    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:26.777581    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:26.777585    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:26.779974    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:27.278458    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:27.278474    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:27.278481    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:27.278484    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:27.280521    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:27.280595    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:27.777967    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:27.777987    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:27.777996    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:27.778001    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:27.780485    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:28.277807    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:28.277826    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:28.277838    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:28.277846    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:28.280427    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:28.777498    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:28.777510    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:28.777516    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:28.777520    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:28.779847    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:29.277964    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:29.277985    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:29.277996    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:29.278002    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:29.280815    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:29.280906    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:29.778537    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:29.778559    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:29.778570    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:29.778575    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:29.781623    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:30.277396    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:30.277412    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:30.277420    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:30.277424    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:30.279862    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:30.778701    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:30.778785    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:30.778800    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:30.778808    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:30.781829    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:31.278707    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:31.278727    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:31.278738    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:31.278744    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:31.281658    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:31.281726    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:31.778169    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:31.778189    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:31.778199    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:31.778205    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:31.781541    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:32.277415    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:32.277446    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:32.277488    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:32.277498    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:32.280928    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:32.777636    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:32.777722    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:32.777736    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:32.777743    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:32.780331    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:33.278774    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:33.278793    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:33.278802    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:33.278807    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:33.281266    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:33.778581    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:33.778604    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:33.778615    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:33.778622    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:33.781819    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:33.781931    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:34.278488    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:34.278512    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:34.278538    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:34.278546    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:34.281635    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:34.777686    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:34.777700    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:34.777708    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:34.777713    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:34.780113    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:35.277895    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:35.277919    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:35.277930    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:35.277935    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:35.281263    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:35.777425    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:35.777449    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:35.777467    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:35.777477    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:35.780717    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:36.279317    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:36.279363    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:36.279373    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:36.279381    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:36.282024    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:36.282088    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:36.777443    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:36.777459    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:36.777468    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:36.777473    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:36.779899    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:37.278285    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:37.278300    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:37.278306    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:37.278311    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:37.280691    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:37.778439    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:37.778466    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:37.778477    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:37.778484    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:37.781678    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:38.279008    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:38.279038    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:38.279051    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:38.279059    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:38.282603    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:38.282694    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:38.778818    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:38.778844    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:38.778855    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:38.778861    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:38.783197    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:40:39.278660    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:39.278672    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:39.278678    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:39.278681    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:39.280786    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:39.777503    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:39.777522    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:39.777535    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:39.777541    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:39.780544    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:40.278292    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:40.278317    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:40.278329    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:40.278337    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:40.281137    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:40.778006    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:40.778032    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:40.778057    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:40.778071    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:40.781057    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:40.781158    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:41.278405    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:41.278469    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:41.278519    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:41.278533    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:41.281715    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:41.777417    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:41.777432    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:41.777438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:41.777441    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:41.779462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:42.278930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:42.278962    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:42.278969    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:42.278974    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:42.280885    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:42.778654    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:42.778673    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:42.778708    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:42.778714    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:42.781210    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:42.781277    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:43.277428    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:43.277444    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:43.277450    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:43.277454    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:43.279641    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:43.778230    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:43.778243    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:43.778248    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:43.778252    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:43.780641    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:44.278516    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:44.278536    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:44.278545    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:44.278550    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:44.280826    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:44.777524    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:44.777543    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:44.777554    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:44.777560    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:44.780897    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:45.279411    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:45.279427    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:45.279433    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:45.279436    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:45.281622    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:45.281684    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:45.779055    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:45.779071    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:45.779078    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:45.779081    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:45.780982    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:46.278769    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:46.278788    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:46.278794    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:46.278799    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:46.280873    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:46.779140    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:46.779158    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:46.779191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:46.779195    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:46.781223    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:47.277666    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:47.277689    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:47.277725    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:47.277732    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:47.280012    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:47.778363    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:47.778385    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:47.778394    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:47.778399    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:47.780853    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:47.780917    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:48.277895    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:48.277909    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:48.277915    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:48.277917    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:48.279760    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:48.778443    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:48.778469    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:48.778480    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:48.778487    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:48.782101    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:49.278898    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:49.278941    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:49.278948    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:49.278952    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:49.280953    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:49.779196    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:49.779209    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:49.779218    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:49.779222    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:49.781778    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:49.781836    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:50.277693    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:50.277708    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:50.277717    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:50.277721    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:50.279726    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:50.778035    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:50.778058    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:50.778070    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:50.778079    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:50.781019    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:51.277510    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:51.277549    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:51.277564    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:51.277567    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:51.279483    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:51.779060    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:51.779084    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:51.779113    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:51.779118    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:51.781564    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:52.278175    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:52.278187    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:52.278193    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:52.278197    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:52.280098    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:52.280167    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:52.778702    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:52.778717    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:52.778726    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:52.778730    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:52.781143    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:53.278862    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:53.278918    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:53.278925    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:53.278930    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:53.281004    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:53.779375    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:53.779401    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:53.779412    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:53.779418    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:53.783259    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:54.279308    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:54.279324    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:54.279331    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:54.279334    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:54.281428    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:54.281496    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:54.778158    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:54.778177    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:54.778191    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:54.778198    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:54.781197    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:55.277649    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:55.277663    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:55.277668    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:55.277672    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:55.279760    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:55.779263    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:55.779320    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:55.779330    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:55.779335    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:55.781789    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:56.278112    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:56.278124    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:56.278129    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:56.278133    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:56.280134    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:56.777990    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:56.778010    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:56.778022    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:56.778032    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:56.781213    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:40:56.781297    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:57.279143    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:57.279158    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:57.279164    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:57.279168    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:57.280873    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:57.778857    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:57.778881    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:57.778893    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:57.778900    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:57.781501    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:40:58.278300    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:58.278312    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:58.278318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:58.278324    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:58.280252    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:58.779105    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:58.779139    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:58.779146    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:58.779150    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:58.780953    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:59.277772    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:59.277810    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:59.277817    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:59.277821    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:59.279828    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:40:59.279892    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:40:59.778306    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:40:59.778323    3744 round_trippers.go:469] Request Headers:
	I0831 15:40:59.778334    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:40:59.778339    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:40:59.781562    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:00.279133    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:00.279147    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:00.279154    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:00.279157    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:00.280989    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:00.777674    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:00.777695    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:00.777706    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:00.777714    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:00.780625    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:01.278700    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:01.278712    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:01.278718    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:01.278722    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:01.280680    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:01.280741    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:01.778674    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:01.778695    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:01.778704    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:01.778709    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:01.781227    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:02.278744    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:02.278759    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:02.278764    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:02.278767    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:02.280603    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:02.778554    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:02.778581    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:02.778646    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:02.778654    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:02.781844    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:03.277956    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:03.277979    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:03.277986    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:03.277990    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:03.279854    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:03.777682    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:03.777698    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:03.777704    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:03.777707    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:03.780161    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:03.780218    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:04.278517    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:04.278529    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:04.278535    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:04.278537    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:04.280362    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:04.778969    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:04.778980    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:04.778986    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:04.778989    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:04.782704    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:05.278152    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:05.278165    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:05.278170    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:05.278173    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:05.280542    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:05.778218    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:05.778303    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:05.778320    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:05.778329    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:05.781501    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:05.781585    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:06.277766    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:06.277778    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:06.277784    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:06.277787    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:06.279880    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:06.778001    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:06.778021    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:06.778033    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:06.778039    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:06.781121    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:07.278457    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:07.278468    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:07.278474    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:07.278478    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:07.280352    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:07.778000    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:07.778020    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:07.778031    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:07.778037    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:07.781054    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:08.277960    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:08.277972    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:08.277978    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:08.277982    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:08.279721    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:08.279790    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:08.777988    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:08.778006    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:08.778014    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:08.778019    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:08.780429    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:09.278866    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:09.278887    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:09.278894    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:09.278898    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:09.280928    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:09.777942    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:09.777961    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:09.777972    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:09.777978    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:09.781177    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:10.279287    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:10.279340    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:10.279352    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:10.279358    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:10.281250    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:10.281309    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:10.779851    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:10.779871    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:10.779883    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:10.779888    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:10.783174    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:11.279199    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:11.279213    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:11.279219    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:11.279223    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:11.281075    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:11.778332    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:11.778350    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:11.778359    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:11.778363    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:11.780504    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:12.279627    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:12.279653    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:12.279665    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:12.279671    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:12.282520    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:12.282615    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:12.778391    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:12.778412    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:12.778424    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:12.778428    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:12.781446    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:13.278111    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:13.278123    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:13.278129    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:13.278132    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:13.279887    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:13.779218    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:13.779233    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:13.779239    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:13.779242    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:13.781352    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:14.277806    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:14.277818    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:14.277823    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:14.277827    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:14.279913    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:14.779779    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:14.779797    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:14.779808    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:14.779814    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:14.783141    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:14.783269    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:15.278003    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:15.278017    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:15.278023    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:15.278027    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:15.279730    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:15.778699    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:15.778720    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:15.778731    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:15.778737    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:15.781939    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:16.278805    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:16.278818    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:16.278846    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:16.278851    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:16.280696    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:16.778278    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:16.778298    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:16.778307    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:16.778312    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:16.780692    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:17.278010    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:17.278061    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:17.278071    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:17.278075    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:17.280183    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:17.280244    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:17.779658    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:17.779684    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:17.779696    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:17.779703    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:17.782964    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:18.279131    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:18.279146    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:18.279152    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:18.279155    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:18.281127    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:18.778591    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:18.778613    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:18.778624    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:18.778631    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:18.781947    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:19.278144    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:19.278156    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:19.278162    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:19.278165    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:19.280314    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:19.280374    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:19.779309    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:19.779328    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:19.779339    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:19.779346    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:19.782226    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:20.278897    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:20.278909    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:20.278914    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:20.278917    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:20.280839    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:20.779038    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:20.779071    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:20.779085    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:20.779095    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:20.782073    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:21.278315    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:21.278364    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:21.278371    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:21.278376    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:21.280407    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:21.280468    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:21.778122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:21.778137    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:21.778144    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:21.778146    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:21.780207    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:22.278547    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:22.278561    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:22.278568    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:22.278571    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:22.280976    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:22.778009    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:22.778029    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:22.778040    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:22.778045    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:22.780889    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:23.278954    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:23.278999    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:23.279008    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:23.279011    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:23.283528    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:41:23.283590    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:23.779486    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:23.779512    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:23.779523    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:23.779536    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:23.782922    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:24.277863    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:24.277876    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:24.277882    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:24.277885    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:24.279860    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:24.779167    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:24.779185    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:24.779196    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:24.779202    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:24.782106    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:25.279100    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:25.279120    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:25.279131    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:25.279139    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:25.282042    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:25.778565    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:25.778640    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:25.778655    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:25.778663    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:25.781719    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:25.781792    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:26.279146    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:26.279182    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:26.279223    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:26.279229    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:26.282148    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:26.778592    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:26.778614    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:26.778626    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:26.778632    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:26.782054    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:27.278821    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:27.278835    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:27.278842    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:27.278845    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:27.281364    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:27.778073    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:27.778100    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:27.778118    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:27.778125    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:27.781324    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:28.277935    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:28.277959    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:28.278022    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:28.278033    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:28.281297    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:28.281465    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:28.778608    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:28.778635    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:28.778648    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:28.778656    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:28.781848    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:29.278110    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:29.278132    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:29.278143    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:29.278148    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:29.281146    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:29.778251    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:29.778265    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:29.778273    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:29.778277    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:29.782398    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:41:30.279687    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:30.279700    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:30.279708    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:30.279712    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:30.282090    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:30.282159    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:30.779599    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:30.779624    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:30.779636    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:30.779642    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:30.783210    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:31.279353    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:31.279366    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:31.279372    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:31.279376    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:31.281276    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:31.779671    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:31.779692    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:31.779704    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:31.779709    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:31.781611    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:32.279371    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:32.279395    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:32.279435    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:32.279442    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:32.282259    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:32.282329    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:32.779427    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:32.779446    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:32.779458    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:32.779463    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:32.782235    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:33.279452    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:33.279465    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:33.279471    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:33.279474    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:33.281321    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:33.778052    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:33.778072    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:33.778083    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:33.778089    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:33.781878    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:34.278548    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:34.278567    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:34.278575    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:34.278584    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:34.281417    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:34.779174    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:34.779193    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:34.779205    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:34.779210    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:34.782050    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:34.782115    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:35.278139    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:35.278152    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:35.278158    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:35.278162    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:35.279993    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:35.779363    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:35.779450    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:35.779465    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:35.779473    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:35.782313    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:36.278357    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:36.278383    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:36.278394    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:36.278400    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:36.281375    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:36.778762    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:36.778799    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:36.778808    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:36.778814    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:36.780954    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:37.279993    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:37.280053    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:37.280067    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:37.280075    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:37.282945    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:37.283021    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:37.779739    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:37.779783    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:37.779790    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:37.779796    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:37.781629    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:38.279147    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:38.279171    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:38.279184    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:38.279190    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:38.281843    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:38.778741    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:38.778764    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:38.778814    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:38.778819    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:38.781350    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:39.279360    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:39.279391    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:39.279399    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:39.279405    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:39.281151    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:39.778714    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:39.778733    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:39.778744    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:39.778752    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:39.781665    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:39.781800    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:40.278379    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:40.278393    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:40.278401    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:40.278405    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:40.280809    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:40.779343    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:40.779384    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:40.779392    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:40.779398    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:40.781388    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:41.279463    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:41.279490    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:41.279503    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:41.279508    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:41.282590    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:41.779242    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:41.779260    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:41.779267    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:41.779272    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:41.781369    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:42.279466    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:42.279483    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:42.279489    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:42.279492    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:42.281217    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:42.281311    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:42.778084    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:42.778101    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:42.778109    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:42.778112    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:42.780674    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:43.279061    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:43.279078    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:43.279088    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:43.279093    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:43.281059    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:43.779095    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:43.779129    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:43.779136    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:43.779138    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:43.781068    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:44.279029    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:44.279048    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:44.279058    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:44.279063    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:44.281431    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:44.281488    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:44.779540    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:44.779553    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:44.779562    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:44.779566    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:44.782120    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:45.278415    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:45.278429    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:45.278440    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:45.278444    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:45.280960    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:45.778255    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:45.778298    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:45.778305    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:45.778309    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:45.780347    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:46.279010    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:46.279030    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:46.279041    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:46.279046    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:46.282148    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:46.282239    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:46.779747    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:46.779768    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:46.779776    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:46.779782    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:46.782151    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:47.278274    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:47.278298    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:47.278339    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:47.278345    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:47.280731    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:47.778365    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:47.778390    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:47.778399    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:47.778408    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:47.781184    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:48.279756    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:48.279775    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:48.279785    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:48.279789    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:48.282380    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:48.282440    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:48.780165    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:48.780186    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:48.780197    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:48.780205    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:48.783195    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:49.278649    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:49.278669    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:49.278680    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:49.278685    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:49.281793    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:49.780041    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:49.780056    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:49.780064    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:49.780069    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:49.782464    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:50.278528    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:50.278538    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:50.278545    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:50.278549    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:50.280284    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:50.778556    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:50.778582    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:50.778591    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:50.778596    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:50.781794    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:50.781879    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:51.278412    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:51.278448    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:51.278456    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:51.278459    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:51.280359    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:51.778770    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:51.778852    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:51.778867    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:51.778876    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:51.781710    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:52.279069    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:52.279089    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:52.279101    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:52.279107    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:52.282688    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:52.778612    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:52.778627    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:52.778634    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:52.778636    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:52.780790    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:53.278839    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:53.278918    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:53.278932    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:53.278939    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:53.281791    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:53.281864    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:53.778930    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:53.778944    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:53.778953    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:53.778998    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:53.780750    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:54.279141    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:54.279163    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:54.279202    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:54.279208    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:54.281212    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:54.780288    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:54.780307    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:54.780318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:54.780326    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:54.783446    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:55.278636    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:55.278655    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:55.278669    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:55.278675    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:55.281304    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:55.778487    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:55.778506    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:55.778513    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:55.778517    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:55.780794    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:55.780852    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:56.279529    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:56.279542    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:56.279548    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:56.279552    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:56.281403    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:56.779390    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:56.779415    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:56.779427    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:56.779435    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:56.782652    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:57.279730    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:57.279749    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:57.279775    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:57.279778    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:57.282199    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:57.778341    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:57.778353    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:57.778360    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:57.778364    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:57.780339    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:58.280180    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:58.280200    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:58.280208    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:58.280212    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:58.283270    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:41:58.283333    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:41:58.778922    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:58.778934    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:58.778941    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:58.778944    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:58.781165    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:41:59.278658    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:59.278670    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:59.278677    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:59.278680    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:59.280526    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:41:59.780251    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:41:59.780269    3744 round_trippers.go:469] Request Headers:
	I0831 15:41:59.780278    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:41:59.780285    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:41:59.783254    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:00.278299    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:00.278311    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:00.278318    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:00.278321    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:00.280462    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:00.778333    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:00.778357    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:00.778417    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:00.778425    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:00.781396    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:00.781503    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:01.279261    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:01.279281    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:01.279292    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:01.279299    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:01.282233    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:01.778447    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:01.778464    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:01.778472    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:01.778476    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:01.780643    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:02.278526    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:02.278545    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:02.278557    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:02.278563    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:02.281443    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:02.778669    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:02.778693    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:02.778704    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:02.778709    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:02.782028    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:02.782104    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:03.278662    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:03.278675    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:03.278681    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:03.278684    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:03.281034    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:03.779554    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:03.779600    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:03.779611    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:03.779619    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:03.782537    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:04.278499    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:04.278522    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:04.278534    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:04.278542    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:04.281683    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:04.779122    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:04.779133    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:04.779140    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:04.779143    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:04.781151    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:42:05.279493    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:05.279515    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:05.279527    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:05.279535    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:05.283494    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:05.283569    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:05.779088    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:05.779167    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:05.779181    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:05.779187    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:05.782371    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:06.279314    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:06.279355    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:06.279363    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:06.279369    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:06.281532    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:06.779431    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:06.779454    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:06.779465    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:06.779473    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:06.782521    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:07.279058    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:07.279070    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:07.279078    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:07.279083    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:07.281403    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:07.780066    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:07.780081    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:07.780088    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:07.780092    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:07.782413    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:07.782477    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:08.278582    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:08.278601    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:08.278612    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:08.278617    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:08.281655    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:08.779457    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:08.779482    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:08.779494    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:08.779500    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:08.782874    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:09.278624    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:09.278660    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:09.278667    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:09.278671    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:09.280685    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:09.780183    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:09.780196    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:09.780204    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:09.780208    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:09.782479    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:09.782566    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:10.279033    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:10.279051    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:10.279074    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:10.279077    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:10.281035    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:42:10.778903    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:10.778916    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:10.778923    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:10.778926    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:10.781070    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:11.279519    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:11.279545    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:11.279587    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:11.279593    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:11.281932    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:11.780008    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:11.780026    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:11.780035    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:11.780041    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:11.782405    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:12.279415    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:12.279432    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:12.279438    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:12.279441    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:12.283584    3744 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:42:12.283720    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:12.779135    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:12.779163    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:12.779182    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:12.779192    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:12.782385    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:13.279985    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:13.280010    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:13.280074    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:13.280083    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:13.287032    3744 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0831 15:42:13.778812    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:13.778824    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:13.778832    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:13.778837    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:13.780875    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:14.278421    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:14.278446    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:14.278459    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:14.278468    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:14.281130    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:14.778988    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:14.779006    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:14.779017    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:14.779025    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:14.782314    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:14.782385    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:15.279457    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:15.279477    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:15.279486    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:15.279492    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:15.281789    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:15.779420    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:15.779447    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:15.779459    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:15.779465    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:15.782822    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:16.278493    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:16.278512    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:16.278521    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:16.278526    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:16.280744    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:16.779399    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:16.779415    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:16.779421    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:16.779424    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:16.781497    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:17.279997    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:17.280026    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:17.280038    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:17.280046    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:17.283600    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:17.283684    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:17.778578    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:17.778593    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:17.778641    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:17.778645    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:17.780800    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:18.279627    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:18.279643    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:18.279650    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:18.279653    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:18.281669    3744 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:42:18.778603    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:18.778615    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:18.778621    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:18.778625    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:18.781667    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:19.279738    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:19.279765    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:19.279777    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:19.279785    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:19.282926    3744 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:42:19.779767    3744 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:42:19.779781    3744 round_trippers.go:469] Request Headers:
	I0831 15:42:19.779788    3744 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:42:19.779791    3744 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:42:19.781778    3744 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:42:19.781867    3744 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:42:19.781882    3744 node_ready.go:38] duration metric: took 4m0.003563812s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:42:19.812481    3744 out.go:201] 
	W0831 15:42:19.833493    3744 out.go:270] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0831 15:42:19.833512    3744 out.go:270] * 
	W0831 15:42:19.834711    3744 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 15:42:19.917735    3744 out.go:201] 
	
	
	==> Docker <==
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.219890563Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.220222326Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 cri-dockerd[1422]: time="2024-08-31T22:37:15Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/b2a8128cbfc292835f200d6551b039f9078ca4bc34012a439cb84e9977fa736b/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.321266510Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.321331709Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.321344565Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.321411223Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 cri-dockerd[1422]: time="2024-08-31T22:37:15Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/eb9132907eda4d53e71edd7c7c0cba6cb88a38299639a216ab3394c1ee636b08/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.533698709Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.533801876Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.533841143Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.535981528Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 cri-dockerd[1422]: time="2024-08-31T22:37:15Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/88b8aff8a006d67d53ddbefdb7171c2dba6d6b8082457d8b875b0980fe0a3f82/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.781886172Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.781967190Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.782044910Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:15 ha-949000 dockerd[1175]: time="2024-08-31T22:37:15.782180434Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:45 ha-949000 dockerd[1168]: time="2024-08-31T22:37:45.904555766Z" level=info msg="ignoring event" container=c7ade311e2b6bcc0e1f37e83b236eaec5caafb139b65d92f8114faaed4aacb77 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Aug 31 22:37:45 ha-949000 dockerd[1175]: time="2024-08-31T22:37:45.905026545Z" level=info msg="shim disconnected" id=c7ade311e2b6bcc0e1f37e83b236eaec5caafb139b65d92f8114faaed4aacb77 namespace=moby
	Aug 31 22:37:45 ha-949000 dockerd[1175]: time="2024-08-31T22:37:45.905076623Z" level=warning msg="cleaning up after shim disconnected" id=c7ade311e2b6bcc0e1f37e83b236eaec5caafb139b65d92f8114faaed4aacb77 namespace=moby
	Aug 31 22:37:45 ha-949000 dockerd[1175]: time="2024-08-31T22:37:45.905085418Z" level=info msg="cleaning up dead shim" namespace=moby
	Aug 31 22:37:58 ha-949000 dockerd[1175]: time="2024-08-31T22:37:58.377002915Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:37:58 ha-949000 dockerd[1175]: time="2024-08-31T22:37:58.377073590Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:37:58 ha-949000 dockerd[1175]: time="2024-08-31T22:37:58.377087074Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:37:58 ha-949000 dockerd[1175]: time="2024-08-31T22:37:58.377452368Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	9743646580e07       6e38f40d628db                                                                                         4 minutes ago       Running             storage-provisioner       2                   e485647500358       storage-provisioner
	f5deb862745e4       8c811b4aec35f                                                                                         5 minutes ago       Running             busybox                   1                   88b8aff8a006d       busybox-7dff88458-5kkbw
	f89b862064139       ad83b2ca7b09e                                                                                         5 minutes ago       Running             kube-proxy                1                   eb9132907eda4       kube-proxy-q7ndn
	ac487ac32c364       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   1                   b2a8128cbfc29       coredns-6f6b679f8f-snq8s
	ff98d7e38a1e6       12968670680f4                                                                                         5 minutes ago       Running             kindnet-cni               1                   fc1aa95e54f86       kindnet-jzj42
	c4dc6059b2150       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   1                   9b710526ef4f9       coredns-6f6b679f8f-kjszm
	c7ade311e2b6b       6e38f40d628db                                                                                         5 minutes ago       Exited              storage-provisioner       1                   e485647500358       storage-provisioner
	3dd9e3bd3e1f5       045733566833c                                                                                         5 minutes ago       Running             kube-controller-manager   2                   5f88515d4139e       kube-controller-manager-ha-949000
	5b0ac6b7faf7d       1766f54c897f0                                                                                         5 minutes ago       Running             kube-scheduler            1                   6e330e66cf27f       kube-scheduler-ha-949000
	fa476ce36b900       604f5db92eaa8                                                                                         5 minutes ago       Running             kube-apiserver            1                   05f6f2cfbf46d       kube-apiserver-ha-949000
	2255978551ea3       2e96e5913fc06                                                                                         5 minutes ago       Running             etcd                      1                   d62930734f2f9       etcd-ha-949000
	740de9cc660e2       045733566833c                                                                                         5 minutes ago       Exited              kube-controller-manager   1                   5f88515d4139e       kube-controller-manager-ha-949000
	0bb147eb5f408       38af8ddebf499                                                                                         5 minutes ago       Running             kube-vip                  0                   9ac139ab4844d       kube-vip-ha-949000
	2f925f16b74b0       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   10 minutes ago      Exited              busybox                   0                   f68483c946835       busybox-7dff88458-5kkbw
	b1db836cd7a3d       cbb01a7bd410d                                                                                         12 minutes ago      Exited              coredns                   0                   271da20951c9a       coredns-6f6b679f8f-kjszm
	def4d6bd20bc5       cbb01a7bd410d                                                                                         12 minutes ago      Exited              coredns                   0                   1017bd5eac1d2       coredns-6f6b679f8f-snq8s
	6d156ce626115       kindest/kindnetd@sha256:e59a687ca28ae274a2fc92f1e2f5f1c739f353178a43a23aafc71adb802ed166              12 minutes ago      Exited              kindnet-cni               0                   7d1851c17485c       kindnet-jzj42
	54d5f8041c89d       ad83b2ca7b09e                                                                                         12 minutes ago      Exited              kube-proxy                0                   4b0198ac7dc52       kube-proxy-q7ndn
	c734c23a53082       2e96e5913fc06                                                                                         12 minutes ago      Exited              etcd                      0                   7cfaf9f5d4dd4       etcd-ha-949000
	02c10e4f765d1       1766f54c897f0                                                                                         12 minutes ago      Exited              kube-scheduler            0                   c084f2a259f6c       kube-scheduler-ha-949000
	ffec6106be6c8       604f5db92eaa8                                                                                         12 minutes ago      Exited              kube-apiserver            0                   25c49852f78dc       kube-apiserver-ha-949000
	
	
	==> coredns [ac487ac32c36] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37668 - 17883 "HINFO IN 4931414995021238036.4254872758042696539. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.026863898s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1645472327]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30003ms):
	Trace[1645472327]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30002ms (22:37:45.839)
	Trace[1645472327]: [30.003429832s] [30.003429832s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[2054948566]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.838) (total time: 30003ms):
	Trace[2054948566]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30003ms (22:37:45.841)
	Trace[2054948566]: [30.003549662s] [30.003549662s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[850581595]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.840) (total time: 30001ms):
	Trace[850581595]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (22:37:45.841)
	Trace[850581595]: [30.001289039s] [30.001289039s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [b1db836cd7a3] <==
	[INFO] 10.244.1.2:58757 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000418868s
	[INFO] 10.244.1.2:39299 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000067106s
	[INFO] 10.244.2.2:56948 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000080585s
	[INFO] 10.244.2.2:56973 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000078985s
	[INFO] 10.244.2.2:43081 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000100123s
	[INFO] 10.244.2.2:56390 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000040214s
	[INFO] 10.244.2.2:52519 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000061255s
	[INFO] 10.244.0.4:36226 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000151133s
	[INFO] 10.244.1.2:44017 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089111s
	[INFO] 10.244.1.2:37224 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000069144s
	[INFO] 10.244.1.2:51282 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000118723s
	[INFO] 10.244.2.2:35009 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089507s
	[INFO] 10.244.2.2:60607 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000049176s
	[INFO] 10.244.2.2:36851 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000097758s
	[INFO] 10.244.0.4:59717 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000053986s
	[INFO] 10.244.0.4:58447 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000060419s
	[INFO] 10.244.1.2:60381 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000136898s
	[INFO] 10.244.1.2:32783 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.00010303s
	[INFO] 10.244.1.2:44904 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000042493s
	[INFO] 10.244.1.2:44085 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000132084s
	[INFO] 10.244.2.2:43635 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000080947s
	[INFO] 10.244.2.2:40020 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000081919s
	[INFO] 10.244.2.2:53730 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000058015s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [c4dc6059b215] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:55597 - 61955 "HINFO IN 5411809642052316829.545085282119266902. udp 56 false 512" NXDOMAIN qr,rd,ra 131 0.026601414s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1248174265]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30003ms):
	Trace[1248174265]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30002ms (22:37:45.839)
	Trace[1248174265]: [30.003765448s] [30.003765448s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[313955954]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.840) (total time: 30001ms):
	Trace[313955954]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (22:37:45.841)
	Trace[313955954]: [30.001623019s] [30.001623019s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1099528094]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30004ms):
	Trace[1099528094]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30004ms (22:37:45.842)
	Trace[1099528094]: [30.004679878s] [30.004679878s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [def4d6bd20bc] <==
	[INFO] 10.244.1.2:55576 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.000574417s
	[INFO] 10.244.1.2:36293 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000065455s
	[INFO] 10.244.2.2:41223 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000063892s
	[INFO] 10.244.0.4:54135 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000096141s
	[INFO] 10.244.0.4:39176 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000742646s
	[INFO] 10.244.0.4:58445 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000080113s
	[INFO] 10.244.0.4:56242 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000066269s
	[INFO] 10.244.0.4:60657 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049645s
	[INFO] 10.244.1.2:48306 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000561931s
	[INFO] 10.244.1.2:40767 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000077826s
	[INFO] 10.244.1.2:35669 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000056994s
	[INFO] 10.244.1.2:57720 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000040565s
	[INFO] 10.244.2.2:38794 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000136901s
	[INFO] 10.244.2.2:33576 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000052374s
	[INFO] 10.244.2.2:57053 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000051289s
	[INFO] 10.244.0.4:47623 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000056903s
	[INFO] 10.244.0.4:59818 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00003011s
	[INFO] 10.244.0.4:53586 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000029565s
	[INFO] 10.244.1.2:60045 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000060878s
	[INFO] 10.244.2.2:38400 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000078624s
	[INFO] 10.244.0.4:58765 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000075707s
	[INFO] 10.244.0.4:32804 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000050785s
	[INFO] 10.244.2.2:48459 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.00007773s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	Name:               ha-949000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:29:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:42:33 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:42:12 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:42:12 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:42:12 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:42:12 +0000   Sat, 31 Aug 2024 22:37:06 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-949000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 199c42a1ef3943388f047673dca52741
	  System UUID:                98ca49d1-0000-0000-9e6c-321a4533d56e
	  Boot ID:                    ede31f27-0dff-4107-9a48-7cb2c0328412
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-5kkbw              0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 coredns-6f6b679f8f-kjszm             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     12m
	  kube-system                 coredns-6f6b679f8f-snq8s             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     12m
	  kube-system                 etcd-ha-949000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         12m
	  kube-system                 kindnet-jzj42                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      12m
	  kube-system                 kube-apiserver-ha-949000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-controller-manager-ha-949000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-proxy-q7ndn                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-scheduler-ha-949000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-vip-ha-949000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m20s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 12m                  kube-proxy       
	  Normal  Starting                 5m17s                kube-proxy       
	  Normal  Starting                 12m                  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  12m                  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  12m                  kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    12m                  kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     12m                  kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           12m                  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeReady                12m                  kubelet          Node ha-949000 status is now: NodeReady
	  Normal  RegisteredNode           11m                  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           10m                  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           8m5s                 node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeHasSufficientMemory  6m6s (x8 over 6m6s)  kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  Starting                 6m6s                 kubelet          Starting kubelet.
	  Normal  NodeHasNoDiskPressure    6m6s (x8 over 6m6s)  kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     6m6s (x7 over 6m6s)  kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  6m6s                 kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           5m35s                node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           5m17s                node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           4m51s                node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	
	
	Name:               ha-949000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:30:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:42:32 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:42:01 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:42:01 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:42:01 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:42:01 +0000   Sat, 31 Aug 2024 22:31:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-949000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 86a1a86d2cdf4cba8c80d25d466d7a14
	  System UUID:                23e54f3d-0000-0000-86b7-b25c818528d1
	  Boot ID:                    eb3152fc-98b8-4334-9705-7b182a7d2f78
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-6r9s5                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 etcd-ha-949000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         11m
	  kube-system                 kindnet-brtj6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      11m
	  kube-system                 kube-apiserver-ha-949000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 kube-controller-manager-ha-949000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 kube-proxy-4r2bt                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 kube-scheduler-ha-949000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 kube-vip-ha-949000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         11m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 5m36s                  kube-proxy       
	  Normal   Starting                 8m7s                   kube-proxy       
	  Normal   Starting                 11m                    kube-proxy       
	  Normal   NodeAllocatableEnforced  11m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  11m (x8 over 11m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    11m (x8 over 11m)      kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     11m (x7 over 11m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           11m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           11m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           10m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   Starting                 8m12s                  kubelet          Starting kubelet.
	  Warning  Rebooted                 8m12s                  kubelet          Node ha-949000-m02 has been rebooted, boot id: 4ddbe4b0-7ef0-4715-a631-f977c123c463
	  Normal   NodeHasSufficientPID     8m12s                  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  8m12s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  8m12s                  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    8m12s                  kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   RegisteredNode           8m5s                   node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   Starting                 5m48s                  kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  5m48s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  5m47s (x8 over 5m48s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    5m47s (x8 over 5m48s)  kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     5m47s (x7 over 5m48s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           5m35s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           5m17s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           4m51s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	
	
	==> dmesg <==
	[  +0.000001] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.036538] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.008025] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[  +5.657655] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.007505] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.775908] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.226303] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000003] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.522399] systemd-fstab-generator[463]: Ignoring "noauto" option for root device
	[  +0.101678] systemd-fstab-generator[475]: Ignoring "noauto" option for root device
	[  +1.969329] systemd-fstab-generator[1097]: Ignoring "noauto" option for root device
	[  +0.262499] systemd-fstab-generator[1134]: Ignoring "noauto" option for root device
	[  +0.055714] kauditd_printk_skb: 101 callbacks suppressed
	[  +0.044427] systemd-fstab-generator[1146]: Ignoring "noauto" option for root device
	[  +0.122906] systemd-fstab-generator[1160]: Ignoring "noauto" option for root device
	[  +2.475814] systemd-fstab-generator[1375]: Ignoring "noauto" option for root device
	[  +0.112565] systemd-fstab-generator[1387]: Ignoring "noauto" option for root device
	[  +0.102686] systemd-fstab-generator[1399]: Ignoring "noauto" option for root device
	[  +0.126445] systemd-fstab-generator[1414]: Ignoring "noauto" option for root device
	[  +0.454968] systemd-fstab-generator[1576]: Ignoring "noauto" option for root device
	[  +6.916629] kauditd_printk_skb: 212 callbacks suppressed
	[ +21.586391] kauditd_printk_skb: 40 callbacks suppressed
	[Aug31 22:37] kauditd_printk_skb: 83 callbacks suppressed
	
	
	==> etcd [2255978551ea] <==
	{"level":"info","ts":"2024-08-31T22:37:38.086155Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:37:38.088468Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:37:38.103271Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"6bcd180d94f2f42","stream-type":"stream Message"}
	{"level":"info","ts":"2024-08-31T22:37:38.103349Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:37:38.121926Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"6bcd180d94f2f42","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-08-31T22:37:38.122172Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:42:23.041758Z","caller":"traceutil/trace.go:171","msg":"trace[2063191983] transaction","detail":"{read_only:false; response_revision:2752; number_of_response:1; }","duration":"116.206759ms","start":"2024-08-31T22:42:22.925538Z","end":"2024-08-31T22:42:23.041745Z","steps":["trace[2063191983] 'process raft request'  (duration: 25.985135ms)","trace[2063191983] 'compare'  (duration: 90.0559ms)"],"step_count":2}
	{"level":"info","ts":"2024-08-31T22:42:29.013623Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(3559962241544385584 13314548521573537860)"}
	{"level":"info","ts":"2024-08-31T22:42:29.015476Z","caller":"membership/cluster.go:472","msg":"removed member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844","removed-remote-peer-id":"6bcd180d94f2f42","removed-remote-peer-urls":["https://192.169.0.7:2380"]}
	{"level":"info","ts":"2024-08-31T22:42:29.015640Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"warn","ts":"2024-08-31T22:42:29.016344Z","caller":"etcdserver/server.go:987","msg":"rejected Raft message from removed member","local-member-id":"b8c6c7563d17d844","removed-member-id":"6bcd180d94f2f42"}
	{"level":"warn","ts":"2024-08-31T22:42:29.016572Z","caller":"rafthttp/peer.go:180","msg":"failed to process Raft message","error":"cannot process message from removed member"}
	{"level":"warn","ts":"2024-08-31T22:42:29.016239Z","caller":"rafthttp/stream.go:286","msg":"closed TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:42:29.017133Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"warn","ts":"2024-08-31T22:42:29.017406Z","caller":"rafthttp/stream.go:286","msg":"closed TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:42:29.017680Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:42:29.017856Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"warn","ts":"2024-08-31T22:42:29.018196Z","caller":"rafthttp/stream.go:421","msg":"lost TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","error":"context canceled"}
	{"level":"warn","ts":"2024-08-31T22:42:29.018320Z","caller":"rafthttp/peer_status.go:66","msg":"peer became inactive (message send to peer failed)","peer-id":"6bcd180d94f2f42","error":"failed to read 6bcd180d94f2f42 on stream MsgApp v2 (context canceled)"}
	{"level":"info","ts":"2024-08-31T22:42:29.018416Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"warn","ts":"2024-08-31T22:42:29.018683Z","caller":"rafthttp/stream.go:421","msg":"lost TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42","error":"context canceled"}
	{"level":"info","ts":"2024-08-31T22:42:29.018841Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:42:29.018977Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:42:29.019027Z","caller":"rafthttp/transport.go:355","msg":"removed remote peer","local-member-id":"b8c6c7563d17d844","removed-remote-peer-id":"6bcd180d94f2f42"}
	{"level":"warn","ts":"2024-08-31T22:42:29.030289Z","caller":"embed/config_logging.go:170","msg":"rejected connection on peer endpoint","remote-addr":"192.169.0.7:55160","server-name":"","error":"EOF"}
	
	
	==> etcd [c734c23a5308] <==
	{"level":"info","ts":"2024-08-31T22:36:02.089341Z","caller":"traceutil/trace.go:171","msg":"trace[1950473945] range","detail":"{range_begin:/registry/secrets/; range_end:/registry/secrets0; }","duration":"5.07880235s","start":"2024-08-31T22:35:57.010534Z","end":"2024-08-31T22:36:02.089336Z","steps":["trace[1950473945] 'agreement among raft nodes before linearized reading'  (duration: 5.078744702s)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:36:02.089376Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-08-31T22:35:57.010497Z","time spent":"5.078873485s","remote":"127.0.0.1:50354","response type":"/etcdserverpb.KV/Range","request count":0,"request size":42,"response count":0,"response size":0,"request content":"key:\"/registry/secrets/\" range_end:\"/registry/secrets0\" count_only:true "}
	2024/08/31 22:36:02 WARNING: [core] [Server #8] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-08-31T22:36:02.089450Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"3.731895172s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/statefulsets/\" range_end:\"/registry/statefulsets0\" count_only:true ","response":"","error":"context canceled"}
	{"level":"info","ts":"2024-08-31T22:36:02.089464Z","caller":"traceutil/trace.go:171","msg":"trace[1668294552] range","detail":"{range_begin:/registry/statefulsets/; range_end:/registry/statefulsets0; }","duration":"3.731928485s","start":"2024-08-31T22:35:58.357532Z","end":"2024-08-31T22:36:02.089460Z","steps":["trace[1668294552] 'agreement among raft nodes before linearized reading'  (duration: 3.731895116s)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:36:02.089476Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-08-31T22:35:58.357516Z","time spent":"3.731956501s","remote":"127.0.0.1:50712","response type":"/etcdserverpb.KV/Range","request count":0,"request size":52,"response count":0,"response size":0,"request content":"key:\"/registry/statefulsets/\" range_end:\"/registry/statefulsets0\" count_only:true "}
	2024/08/31 22:36:02 WARNING: [core] [Server #8] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"info","ts":"2024-08-31T22:36:02.126515Z","caller":"etcdserver/server.go:1512","msg":"skipped leadership transfer; local server is not leader","local-member-id":"b8c6c7563d17d844","current-leader-member-id":"0"}
	{"level":"info","ts":"2024-08-31T22:36:02.127073Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127125Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127142Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127279Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127328Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127353Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127363Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:36:02.127367Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.127373Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.127406Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.127962Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.128009Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.128078Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.128107Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"6bcd180d94f2f42"}
	{"level":"info","ts":"2024-08-31T22:36:02.129535Z","caller":"embed/etcd.go:581","msg":"stopping serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-08-31T22:36:02.129687Z","caller":"embed/etcd.go:586","msg":"stopped serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-08-31T22:36:02.129696Z","caller":"embed/etcd.go:379","msg":"closed etcd server","name":"ha-949000","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.169.0.5:2380"],"advertise-client-urls":["https://192.169.0.5:2379"]}
	
	
	==> kernel <==
	 22:42:34 up 6 min,  0 users,  load average: 0.17, 0.32, 0.18
	Linux ha-949000 5.10.207 #1 SMP Wed Aug 28 20:54:17 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [6d156ce62611] <==
	I0831 22:35:15.620720       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:25.613908       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:25.614028       1 main.go:299] handling current node
	I0831 22:35:25.614079       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:25.614094       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:25.614736       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:25.614790       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:35.621230       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:35.621411       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:35.621574       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:35.621705       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:35.621830       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:35.621998       1 main.go:299] handling current node
	I0831 22:35:45.622596       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:45.622733       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:35:45.623036       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:45.623089       1 main.go:299] handling current node
	I0831 22:35:45.623265       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:45.623338       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:55.614888       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:35:55.614962       1 main.go:299] handling current node
	I0831 22:35:55.614980       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:35:55.614989       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:35:55.615216       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:35:55.615320       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kindnet [ff98d7e38a1e] <==
	I0831 22:41:46.423426       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:41:56.421263       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:41:56.421342       1 main.go:299] handling current node
	I0831 22:41:56.421361       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:41:56.421371       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:41:56.421483       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:41:56.421556       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:06.419300       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:42:06.419355       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:06.419448       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:06.419540       1 main.go:299] handling current node
	I0831 22:42:06.419587       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:06.419596       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:16.418758       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:42:16.418878       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:16.419144       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:16.419199       1 main.go:299] handling current node
	I0831 22:42:16.419230       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:16.419256       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:26.418790       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:26.418914       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:26.419229       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:42:26.419399       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:26.419700       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:26.419804       1 main.go:299] handling current node
	
	
	==> kube-apiserver [fa476ce36b90] <==
	I0831 22:36:55.851684       1 controller.go:119] Starting legacy_token_tracking_controller
	I0831 22:36:55.873485       1 shared_informer.go:313] Waiting for caches to sync for configmaps
	I0831 22:36:55.948972       1 apf_controller.go:382] Running API Priority and Fairness config worker
	I0831 22:36:55.949005       1 apf_controller.go:385] Running API Priority and Fairness periodic rebalancing process
	I0831 22:36:55.949434       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0831 22:36:55.949812       1 cache.go:39] Caches are synced for RemoteAvailability controller
	I0831 22:36:55.953147       1 cache.go:39] Caches are synced for LocalAvailability controller
	I0831 22:36:55.953575       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0831 22:36:55.954480       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0831 22:36:55.954969       1 aggregator.go:171] initial CRD sync complete...
	I0831 22:36:55.955092       1 autoregister_controller.go:144] Starting autoregister controller
	I0831 22:36:55.955194       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0831 22:36:55.955309       1 cache.go:39] Caches are synced for autoregister controller
	I0831 22:36:55.957677       1 handler_discovery.go:450] Starting ResourceDiscoveryManager
	W0831 22:36:55.960494       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.6]
	I0831 22:36:55.974621       1 shared_informer.go:320] Caches are synced for configmaps
	I0831 22:36:55.982646       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0831 22:36:55.982788       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0831 22:36:55.982866       1 policy_source.go:224] refreshing policies
	I0831 22:36:55.990600       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0831 22:36:56.065496       1 controller.go:615] quota admission added evaluator for: endpoints
	I0831 22:36:56.078415       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	E0831 22:36:56.080666       1 controller.go:95] Found stale data, removed previous endpoints on kubernetes service, apiserver didn't exit successfully previously
	I0831 22:36:56.858259       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	W0831 22:36:57.190605       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	
	
	==> kube-apiserver [ffec6106be6c] <==
	W0831 22:36:02.115125       1 logging.go:55] [core] [Channel #73 SubChannel #74]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.115222       1 logging.go:55] [core] [Channel #127 SubChannel #128]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.115245       1 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.115261       1 logging.go:55] [core] [Channel #1 SubChannel #3]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.115276       1 logging.go:55] [core] [Channel #133 SubChannel #134]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119407       1 logging.go:55] [core] [Channel #157 SubChannel #158]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119539       1 logging.go:55] [core] [Channel #115 SubChannel #116]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119557       1 logging.go:55] [core] [Channel #46 SubChannel #47]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119573       1 logging.go:55] [core] [Channel #55 SubChannel #56]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119587       1 logging.go:55] [core] [Channel #25 SubChannel #26]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119602       1 logging.go:55] [core] [Channel #82 SubChannel #83]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119655       1 logging.go:55] [core] [Channel #79 SubChannel #80]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119675       1 logging.go:55] [core] [Channel #88 SubChannel #89]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119696       1 logging.go:55] [core] [Channel #94 SubChannel #95]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119711       1 logging.go:55] [core] [Channel #100 SubChannel #101]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119786       1 logging.go:55] [core] [Channel #130 SubChannel #131]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119813       1 logging.go:55] [core] [Channel #124 SubChannel #125]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119870       1 logging.go:55] [core] [Channel #76 SubChannel #77]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119955       1 logging.go:55] [core] [Channel #172 SubChannel #173]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.119994       1 logging.go:55] [core] [Channel #121 SubChannel #122]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.120283       1 logging.go:55] [core] [Channel #154 SubChannel #155]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.120304       1 logging.go:55] [core] [Channel #151 SubChannel #152]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.120414       1 logging.go:55] [core] [Channel #142 SubChannel #143]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.120438       1 logging.go:55] [core] [Channel #166 SubChannel #167]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 22:36:02.114925       1 logging.go:55] [core] [Channel #145 SubChannel #146]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	
	
	==> kube-controller-manager [3dd9e3bd3e1f] <==
	I0831 22:37:40.196435       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="20.594737ms"
	I0831 22:37:40.196800       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="175.922µs"
	I0831 22:37:54.687068       1 endpointslice_controller.go:344] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mxss9 EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mxss9\": the object has been modified; please apply your changes to the latest version and try again"
	I0831 22:37:54.687554       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"c225b6ce-9d24-451b-aa4c-2f6d57886b05", APIVersion:"v1", ResourceVersion:"257", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mxss9 EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mxss9": the object has been modified; please apply your changes to the latest version and try again
	I0831 22:37:54.697104       1 endpointslice_controller.go:344] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mxss9 EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mxss9\": the object has been modified; please apply your changes to the latest version and try again"
	I0831 22:37:54.697155       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"c225b6ce-9d24-451b-aa4c-2f6d57886b05", APIVersion:"v1", ResourceVersion:"257", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mxss9 EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mxss9": the object has been modified; please apply your changes to the latest version and try again
	I0831 22:37:54.698321       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-6f6b679f8f" duration="73.860325ms"
	E0831 22:37:54.698593       1 replica_set.go:560] "Unhandled Error" err="sync \"kube-system/coredns-6f6b679f8f\" failed with Operation cannot be fulfilled on replicasets.apps \"coredns-6f6b679f8f\": the object has been modified; please apply your changes to the latest version and try again" logger="UnhandledError"
	I0831 22:37:54.701342       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-6f6b679f8f" duration="61.894µs"
	I0831 22:37:54.706798       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-6f6b679f8f" duration="103.162µs"
	I0831 22:42:02.055055       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m02"
	I0831 22:42:12.841493       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000"
	I0831 22:42:25.761201       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:42:25.773495       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	I0831 22:42:25.818885       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="27.782008ms"
	I0831 22:42:25.856648       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="37.719766ms"
	I0831 22:42:25.876885       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="19.524845ms"
	I0831 22:42:25.897572       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="20.390236ms"
	I0831 22:42:25.897845       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="44.94µs"
	I0831 22:42:25.922101       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="21.27168ms"
	I0831 22:42:25.924198       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="2.049093ms"
	I0831 22:42:27.932215       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="22.756µs"
	I0831 22:42:28.353417       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="180.672µs"
	I0831 22:42:28.355718       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="31.225µs"
	I0831 22:42:29.749191       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m03"
	
	
	==> kube-controller-manager [740de9cc660e] <==
	I0831 22:36:36.160199       1 serving.go:386] Generated self-signed cert in-memory
	I0831 22:36:36.406066       1 controllermanager.go:197] "Starting" version="v1.31.0"
	I0831 22:36:36.406213       1 controllermanager.go:199] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:36:36.407965       1 dynamic_cafile_content.go:160] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0831 22:36:36.408151       1 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0831 22:36:36.408699       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0831 22:36:36.408792       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	E0831 22:36:56.415496       1 controllermanager.go:242] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: an error on the server (\"[+]ping ok\\n[+]log ok\\n[+]etcd ok\\n[+]poststarthook/start-apiserver-admission-initializer ok\\n[+]poststarthook/generic-apiserver-start-informers ok\\n[+]poststarthook/priority-and-fairness-config-consumer ok\\n[+]poststarthook/priority-and-fairness-filter ok\\n[+]poststarthook/storage-object-count-tracker-hook ok\\n[+]poststarthook/start-apiextensions-informers ok\\n[+]poststarthook/start-apiextensions-controllers ok\\n[+]poststarthook/crd-informer-synced ok\\n[+]poststarthook/start-system-namespaces-controller ok\\n[+]poststarthook/start-cluster-authentication-info-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok\\n[+]poststarthook/start-legacy-token-tracking-controller ok\\n[+]poststarthook/start-service-ip-repair-controllers ok\\n[-]poststarthook/rbac/bootstrap-roles failed: reason withheld\\n[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld\\n[+]poststarthook/priority-and-fairness-config-producer ok\\n[+]poststarthook/bootstrap-controller ok\\n[+]poststarthook/aggregator-reload-proxy-client-cert ok\\n[+]poststarthook/start-kube-aggregator-informers ok\\n[+]poststarthook/apiservice-status-local-available-controller ok\\n[+]poststarthook/apiservice-status-remote-available-controller ok\\n[+]poststarthook/apiservice-registration-controller ok\\n[+]poststarthook/apiservice-discovery-controller ok\\n[+]poststarthook/kube-apiserver-autoregistration ok\\n[+]autoregister-completion ok\\n[+]poststarthook/apiservice-openapi-controller ok\\n[+]poststarthook/apiservice-openapiv3-controller ok\\nhealthz check failed\") has prevented the request from succeeding"
	
	
	==> kube-proxy [54d5f8041c89] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:29:49.977338       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:29:49.983071       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:29:49.983430       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:29:50.023032       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:29:50.023054       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:29:50.023070       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:29:50.025790       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:29:50.026014       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:29:50.026061       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:29:50.026844       1 config.go:197] "Starting service config controller"
	I0831 22:29:50.027602       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:29:50.027141       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:29:50.027698       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:29:50.027260       1 config.go:326] "Starting node config controller"
	I0831 22:29:50.027720       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:29:50.128122       1 shared_informer.go:320] Caches are synced for node config
	I0831 22:29:50.128144       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:29:50.128162       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-proxy [f89b86206413] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:37:16.195275       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:37:16.220357       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:37:16.220590       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:37:16.265026       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:37:16.265177       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:37:16.265305       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:37:16.268348       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:37:16.268734       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:37:16.269061       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:37:16.272514       1 config.go:197] "Starting service config controller"
	I0831 22:37:16.273450       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:37:16.273658       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:37:16.273777       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:37:16.275413       1 config.go:326] "Starting node config controller"
	I0831 22:37:16.277042       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:37:16.374257       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:37:16.375624       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0831 22:37:16.377606       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [02c10e4f765d] <==
	E0831 22:29:42.107231       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.111966       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0831 22:29:42.112045       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.116498       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0831 22:29:42.116539       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:29:42.129701       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0831 22:29:42.129741       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError"
	I0831 22:29:45.342252       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0831 22:31:50.464567       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.464652       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod 9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2(kube-system/kube-proxy-d45q5) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-d45q5"
	E0831 22:31:50.464667       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-d45q5\": pod kube-proxy-d45q5 is already assigned to node \"ha-949000-m03\"" pod="kube-system/kube-proxy-d45q5"
	I0831 22:31:50.464683       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-d45q5" node="ha-949000-m03"
	E0831 22:31:50.476710       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:31:50.476756       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod c551bb18-9a7d-4fca-9724-be7900980a40(kube-system/kindnet-l4zbh) wasn't assumed so cannot be forgotten" pod="kube-system/kindnet-l4zbh"
	E0831 22:31:50.476767       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-l4zbh\": pod kindnet-l4zbh is already assigned to node \"ha-949000-m03\"" pod="kube-system/kindnet-l4zbh"
	I0831 22:31:50.476781       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-l4zbh" node="ha-949000-m03"
	E0831 22:32:20.049491       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-6r9s5" node="ha-949000-m02"
	E0831 22:32:20.049618       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-6r9s5\": pod busybox-7dff88458-6r9s5 is already assigned to node \"ha-949000-m02\"" pod="default/busybox-7dff88458-6r9s5"
	E0831 22:32:20.071235       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-vjf9x" node="ha-949000-m03"
	E0831 22:32:20.071466       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-vjf9x\": pod busybox-7dff88458-vjf9x is already assigned to node \"ha-949000-m03\"" pod="default/busybox-7dff88458-vjf9x"
	E0831 22:32:20.073498       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" plugin="DefaultBinder" pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	E0831 22:32:20.073571       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod e97e21d8-a69e-451c-babd-6232e12aafe0(default/busybox-7dff88458-5kkbw) wasn't assumed so cannot be forgotten" pod="default/busybox-7dff88458-5kkbw"
	E0831 22:32:20.077323       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-7dff88458-5kkbw\": pod busybox-7dff88458-5kkbw is already assigned to node \"ha-949000\"" pod="default/busybox-7dff88458-5kkbw"
	I0831 22:32:20.077394       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="default/busybox-7dff88458-5kkbw" node="ha-949000"
	E0831 22:36:01.972805       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [5b0ac6b7faf7] <==
	I0831 22:36:35.937574       1 serving.go:386] Generated self-signed cert in-memory
	W0831 22:36:46.491998       1 authentication.go:370] Error looking up in-cluster authentication configuration: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication": net/http: TLS handshake timeout
	W0831 22:36:46.492020       1 authentication.go:371] Continuing without authentication configuration. This may treat all requests as anonymous.
	W0831 22:36:46.492025       1 authentication.go:372] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I0831 22:36:55.901677       1 server.go:167] "Starting Kubernetes Scheduler" version="v1.31.0"
	I0831 22:36:55.901714       1 server.go:169] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:36:55.904943       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0831 22:36:55.905195       1 secure_serving.go:213] Serving securely on 127.0.0.1:10259
	I0831 22:36:55.905729       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I0831 22:36:55.906036       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0831 22:36:56.006746       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Aug 31 22:38:28 ha-949000 kubelet[1583]: E0831 22:38:28.334569    1583 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:38:28 ha-949000 kubelet[1583]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:38:28 ha-949000 kubelet[1583]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:38:28 ha-949000 kubelet[1583]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:38:28 ha-949000 kubelet[1583]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:39:28 ha-949000 kubelet[1583]: E0831 22:39:28.333827    1583 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:39:28 ha-949000 kubelet[1583]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:39:28 ha-949000 kubelet[1583]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:39:28 ha-949000 kubelet[1583]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:39:28 ha-949000 kubelet[1583]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:40:28 ha-949000 kubelet[1583]: E0831 22:40:28.335276    1583 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:40:28 ha-949000 kubelet[1583]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:40:28 ha-949000 kubelet[1583]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:40:28 ha-949000 kubelet[1583]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:40:28 ha-949000 kubelet[1583]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:41:28 ha-949000 kubelet[1583]: E0831 22:41:28.333999    1583 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:41:28 ha-949000 kubelet[1583]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:41:28 ha-949000 kubelet[1583]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:41:28 ha-949000 kubelet[1583]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:41:28 ha-949000 kubelet[1583]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:42:28 ha-949000 kubelet[1583]: E0831 22:42:28.334539    1583 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:42:28 ha-949000 kubelet[1583]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:42:28 ha-949000 kubelet[1583]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:42:28 ha-949000 kubelet[1583]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:42:28 ha-949000 kubelet[1583]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-949000 -n ha-949000
helpers_test.go:262: (dbg) Run:  kubectl --context ha-949000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:273: non-running pods: busybox-7dff88458-g8b59
helpers_test.go:275: ======> post-mortem[TestMultiControlPlane/serial/DeleteSecondaryNode]: describe non-running pods <======
helpers_test.go:278: (dbg) Run:  kubectl --context ha-949000 describe pod busybox-7dff88458-g8b59
helpers_test.go:283: (dbg) kubectl --context ha-949000 describe pod busybox-7dff88458-g8b59:

                                                
                                                
-- stdout --
	Name:             busybox-7dff88458-g8b59
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=7dff88458
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-7dff88458
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-jmpb5 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-jmpb5:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age               From               Message
	  ----     ------            ----              ----               -------
	  Warning  FailedScheduling  10s               default-scheduler  0/3 nodes are available: 1 node(s) were unschedulable, 2 node(s) didn't match pod anti-affinity rules. preemption: 0/3 nodes are available: 1 Preemption is not helpful for scheduling, 2 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  8s (x2 over 10s)  default-scheduler  0/3 nodes are available: 1 node(s) were unschedulable, 2 node(s) didn't match pod anti-affinity rules. preemption: 0/3 nodes are available: 1 Preemption is not helpful for scheduling, 2 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  9s (x2 over 11s)  default-scheduler  0/3 nodes are available: 1 node(s) were unschedulable, 2 node(s) didn't match pod anti-affinity rules. preemption: 0/3 nodes are available: 1 Preemption is not helpful for scheduling, 2 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  9s (x2 over 11s)  default-scheduler  0/3 nodes are available: 1 node(s) were unschedulable, 2 node(s) didn't match pod anti-affinity rules. preemption: 0/3 nodes are available: 1 Preemption is not helpful for scheduling, 2 No preemption victims found for incoming pod.

                                                
                                                
-- /stdout --
helpers_test.go:286: <<< TestMultiControlPlane/serial/DeleteSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:287: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/DeleteSecondaryNode (11.50s)

                                                
                                    
TestMultiControlPlane/serial/RestartCluster (377.78s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/RestartCluster
ha_test.go:560: (dbg) Run:  out/minikube-darwin-amd64 start -p ha-949000 --wait=true -v=7 --alsologtostderr --driver=hyperkit 
E0831 15:44:15.448471    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:47:52.707738    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
ha_test.go:560: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p ha-949000 --wait=true -v=7 --alsologtostderr --driver=hyperkit : exit status 80 (6m13.137560115s)

                                                
                                                
-- stdout --
	* [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	* Restarting existing hyperkit VM for "ha-949000" ...
	* Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	* Enabled addons: 
	
	* Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	* Restarting existing hyperkit VM for "ha-949000-m02" ...
	* Found network options:
	  - NO_PROXY=192.169.0.5
	* Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	  - env NO_PROXY=192.169.0.5
	* Verifying Kubernetes components...
	
	* Starting "ha-949000-m04" worker node in "ha-949000" cluster
	* Restarting existing hyperkit VM for "ha-949000-m04" ...
	* Found network options:
	  - NO_PROXY=192.169.0.5,192.169.0.6
	* Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	  - env NO_PROXY=192.169.0.5
	  - env NO_PROXY=192.169.0.5,192.169.0.6
	* Verifying Kubernetes components...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:42:55.897896    4003 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:42:55.898177    4003 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:42:55.898183    4003 out.go:358] Setting ErrFile to fd 2...
	I0831 15:42:55.898187    4003 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:42:55.898378    4003 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:42:55.899837    4003 out.go:352] Setting JSON to false
	I0831 15:42:55.921901    4003 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2546,"bootTime":1725141629,"procs":434,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:42:55.922001    4003 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:42:55.944577    4003 out.go:177] * [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:42:55.987096    4003 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:42:55.987175    4003 notify.go:220] Checking for updates...
	I0831 15:42:56.029932    4003 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:42:56.050856    4003 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:42:56.072033    4003 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:42:56.093103    4003 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:42:56.114053    4003 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:42:56.135758    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:42:56.136428    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.136520    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:56.146197    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52047
	I0831 15:42:56.146589    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:56.146991    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:42:56.147003    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:56.147207    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:56.147336    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.147526    4003 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:42:56.147753    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.147780    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:56.156287    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52049
	I0831 15:42:56.156619    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:56.156971    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:42:56.156990    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:56.157191    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:56.157316    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.186031    4003 out.go:177] * Using the hyperkit driver based on existing profile
	I0831 15:42:56.227918    4003 start.go:297] selected driver: hyperkit
	I0831 15:42:56.227945    4003 start.go:901] validating driver "hyperkit" against &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:42:56.228199    4003 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:42:56.228401    4003 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:42:56.228599    4003 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:42:56.238336    4003 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:42:56.242056    4003 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.242078    4003 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:42:56.244705    4003 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:42:56.244747    4003 cni.go:84] Creating CNI manager for ""
	I0831 15:42:56.244753    4003 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0831 15:42:56.244827    4003 start.go:340] cluster config:
	{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:42:56.244921    4003 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:42:56.286816    4003 out.go:177] * Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	I0831 15:42:56.307847    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:42:56.307937    4003 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:42:56.307963    4003 cache.go:56] Caching tarball of preloaded images
	I0831 15:42:56.308209    4003 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:42:56.308229    4003 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:42:56.308418    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:42:56.309323    4003 start.go:360] acquireMachinesLock for ha-949000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:42:56.309437    4003 start.go:364] duration metric: took 90.572µs to acquireMachinesLock for "ha-949000"
	I0831 15:42:56.309468    4003 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:42:56.309488    4003 fix.go:54] fixHost starting: 
	I0831 15:42:56.309922    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.309949    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:56.318888    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52051
	I0831 15:42:56.319241    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:56.319612    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:42:56.319626    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:56.319866    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:56.320016    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.320133    4003 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:42:56.320226    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:56.320300    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:42:56.321264    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 3756 missing from process table
	I0831 15:42:56.321288    4003 fix.go:112] recreateIfNeeded on ha-949000: state=Stopped err=<nil>
	I0831 15:42:56.321305    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	W0831 15:42:56.321391    4003 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:42:56.363717    4003 out.go:177] * Restarting existing hyperkit VM for "ha-949000" ...
	I0831 15:42:56.384899    4003 main.go:141] libmachine: (ha-949000) Calling .Start
	I0831 15:42:56.385294    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:56.385370    4003 main.go:141] libmachine: (ha-949000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid
	I0831 15:42:56.387089    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 3756 missing from process table
	I0831 15:42:56.387099    4003 main.go:141] libmachine: (ha-949000) DBG | pid 3756 is in state "Stopped"
	I0831 15:42:56.387115    4003 main.go:141] libmachine: (ha-949000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid...
	I0831 15:42:56.387550    4003 main.go:141] libmachine: (ha-949000) DBG | Using UUID 98cab9ba-901d-49d1-9e6c-321a4533d56e
	I0831 15:42:56.496381    4003 main.go:141] libmachine: (ha-949000) DBG | Generated MAC ce:8:77:f7:42:5e
	I0831 15:42:56.496404    4003 main.go:141] libmachine: (ha-949000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:42:56.496533    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003834d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:42:56.496559    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003834d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:42:56.496621    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "98cab9ba-901d-49d1-9e6c-321a4533d56e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:42:56.496665    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 98cab9ba-901d-49d1-9e6c-321a4533d56e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:42:56.496684    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:42:56.498385    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Pid is 4017
	I0831 15:42:56.498816    4003 main.go:141] libmachine: (ha-949000) DBG | Attempt 0
	I0831 15:42:56.498834    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:56.498897    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 4017
	I0831 15:42:56.500466    4003 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:42:56.500539    4003 main.go:141] libmachine: (ha-949000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:42:56.500570    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39c5e}
	I0831 15:42:56.500583    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 15:42:56.500598    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:42:56.500613    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:42:56.500643    4003 main.go:141] libmachine: (ha-949000) DBG | Found match: ce:8:77:f7:42:5e
	I0831 15:42:56.500654    4003 main.go:141] libmachine: (ha-949000) DBG | IP: 192.169.0.5
	I0831 15:42:56.500687    4003 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:42:56.501361    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:42:56.501546    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:42:56.501931    4003 machine.go:93] provisionDockerMachine start ...
	I0831 15:42:56.501942    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.502103    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:42:56.502225    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:42:56.502347    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:42:56.502457    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:42:56.502550    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:42:56.502680    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:42:56.502894    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:42:56.502905    4003 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:42:56.506309    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:42:56.558516    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:42:56.559184    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:42:56.559207    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:42:56.559278    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:42:56.559308    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:42:56.940245    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:42:56.940260    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:42:57.055064    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:42:57.055080    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:42:57.055092    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:42:57.055101    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:42:57.056061    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:42:57.056073    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:43:02.655390    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 15:43:02.655429    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 15:43:02.655438    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 15:43:02.679403    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 15:43:07.568442    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:43:07.568456    4003 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:43:07.568651    4003 buildroot.go:166] provisioning hostname "ha-949000"
	I0831 15:43:07.568662    4003 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:43:07.568760    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.568847    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:07.568962    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.569093    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.569187    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:07.569365    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:07.569534    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:07.569549    4003 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000 && echo "ha-949000" | sudo tee /etc/hostname
	I0831 15:43:07.639291    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000
	
	I0831 15:43:07.639309    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.639436    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:07.639557    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.639638    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.639737    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:07.639874    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:07.640074    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:07.640086    4003 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:43:07.704134    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:43:07.704155    4003 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:43:07.704172    4003 buildroot.go:174] setting up certificates
	I0831 15:43:07.704178    4003 provision.go:84] configureAuth start
	I0831 15:43:07.704186    4003 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:43:07.704317    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:43:07.704420    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.704522    4003 provision.go:143] copyHostCerts
	I0831 15:43:07.704550    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:07.704624    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:43:07.704632    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:07.704768    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:43:07.704971    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:07.705012    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:43:07.705016    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:07.705108    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:43:07.705254    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:07.705294    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:43:07.705299    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:07.705382    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:43:07.705569    4003 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000 san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]
	I0831 15:43:07.906186    4003 provision.go:177] copyRemoteCerts
	I0831 15:43:07.906273    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:43:07.906312    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.906550    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:07.906738    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.906937    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:07.907046    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:07.944033    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:43:07.944107    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:43:07.963419    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:43:07.963482    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0831 15:43:07.982821    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:43:07.982884    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:43:08.001703    4003 provision.go:87] duration metric: took 297.505228ms to configureAuth
	I0831 15:43:08.001714    4003 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:43:08.001892    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:08.001909    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:08.002046    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:08.002137    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:08.002225    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.002306    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.002382    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:08.002501    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:08.002650    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:08.002659    4003 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:43:08.059324    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:43:08.059336    4003 buildroot.go:70] root file system type: tmpfs
	I0831 15:43:08.059403    4003 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:43:08.059416    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:08.059551    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:08.059659    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.059758    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.059843    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:08.059967    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:08.060104    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:08.060148    4003 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:43:08.127622    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:43:08.127643    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:08.127795    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:08.127885    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.127986    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.128093    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:08.128219    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:08.128373    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:08.128385    4003 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:43:09.818482    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:43:09.818495    4003 machine.go:96] duration metric: took 13.316412951s to provisionDockerMachine
	I0831 15:43:09.818507    4003 start.go:293] postStartSetup for "ha-949000" (driver="hyperkit")
	I0831 15:43:09.818514    4003 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:43:09.818524    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:09.818708    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:43:09.818733    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:09.818845    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:09.818952    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.819031    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:09.819124    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:09.856201    4003 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:43:09.861552    4003 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:43:09.861568    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:43:09.861690    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:43:09.861873    4003 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:43:09.861880    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:43:09.862086    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:43:09.873444    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:09.903949    4003 start.go:296] duration metric: took 85.422286ms for postStartSetup
	I0831 15:43:09.903973    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:09.904145    4003 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:43:09.904158    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:09.904244    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:09.904332    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.904406    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:09.904491    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:09.939732    4003 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:43:09.939783    4003 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:43:09.973207    4003 fix.go:56] duration metric: took 13.663579156s for fixHost
	I0831 15:43:09.973228    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:09.973356    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:09.973449    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.973546    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.973619    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:09.973749    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:09.973922    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:09.973930    4003 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:43:10.027714    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725144190.095289778
	
	I0831 15:43:10.027726    4003 fix.go:216] guest clock: 1725144190.095289778
	I0831 15:43:10.027732    4003 fix.go:229] Guest: 2024-08-31 15:43:10.095289778 -0700 PDT Remote: 2024-08-31 15:43:09.973219 -0700 PDT m=+14.110517944 (delta=122.070778ms)
	I0831 15:43:10.027767    4003 fix.go:200] guest clock delta is within tolerance: 122.070778ms
	I0831 15:43:10.027774    4003 start.go:83] releasing machines lock for "ha-949000", held for 13.718178323s
	I0831 15:43:10.027798    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.027932    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:43:10.028026    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.028324    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.028419    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.028500    4003 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:43:10.028533    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:10.028579    4003 ssh_runner.go:195] Run: cat /version.json
	I0831 15:43:10.028591    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:10.028629    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:10.028705    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:10.028719    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:10.028882    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:10.028892    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:10.028975    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:10.028990    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:10.029049    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:10.106178    4003 ssh_runner.go:195] Run: systemctl --version
	I0831 15:43:10.111111    4003 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:43:10.115308    4003 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:43:10.115344    4003 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:43:10.127805    4003 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:43:10.127827    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:10.127920    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:10.145626    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:43:10.154624    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:43:10.163250    4003 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:43:10.163290    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:43:10.172090    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:10.180802    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:43:10.189726    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:10.198477    4003 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:43:10.207531    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:43:10.216228    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:43:10.224957    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:43:10.233724    4003 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:43:10.241776    4003 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:43:10.249895    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:10.347162    4003 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:43:10.365744    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:10.365818    4003 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:43:10.378577    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:10.391840    4003 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:43:10.407333    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:10.418578    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:10.428427    4003 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:43:10.447942    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:10.460400    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:10.475459    4003 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:43:10.478281    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:43:10.485396    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:43:10.498761    4003 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:43:10.593460    4003 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:43:10.696411    4003 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:43:10.696483    4003 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:43:10.710317    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:10.803031    4003 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:43:13.157366    4003 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.354290436s)
	I0831 15:43:13.157446    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:43:13.167970    4003 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:43:13.180929    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:13.191096    4003 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:43:13.293424    4003 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:43:13.392743    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:13.483508    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:43:13.497374    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:13.508419    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:13.608347    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:43:13.667376    4003 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:43:13.667470    4003 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:43:13.671956    4003 start.go:563] Will wait 60s for crictl version
	I0831 15:43:13.672004    4003 ssh_runner.go:195] Run: which crictl
	I0831 15:43:13.675617    4003 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:43:13.702050    4003 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:43:13.702122    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:13.720302    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:13.762901    4003 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:43:13.762952    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:43:13.763326    4003 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:43:13.768068    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:13.778798    4003 kubeadm.go:883] updating cluster {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.
0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:f
alse inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOpt
imizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:43:13.778877    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:43:13.778928    4003 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:43:13.792562    4003 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:43:13.792576    4003 docker.go:615] Images already preloaded, skipping extraction
	I0831 15:43:13.792671    4003 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:43:13.806816    4003 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:43:13.806831    4003 cache_images.go:84] Images are preloaded, skipping loading
	I0831 15:43:13.806842    4003 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.31.0 docker true true} ...
	I0831 15:43:13.806921    4003 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:43:13.806997    4003 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:43:13.845829    4003 cni.go:84] Creating CNI manager for ""
	I0831 15:43:13.845843    4003 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0831 15:43:13.845854    4003 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:43:13.845869    4003 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-949000 NodeName:ha-949000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:43:13.845940    4003 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-949000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 15:43:13.845960    4003 kube-vip.go:115] generating kube-vip config ...
	I0831 15:43:13.846014    4003 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:43:13.859390    4003 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:43:13.859457    4003 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:43:13.859510    4003 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:43:13.867760    4003 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:43:13.867806    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0831 15:43:13.876386    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0831 15:43:13.889628    4003 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:43:13.903120    4003 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0831 15:43:13.916765    4003 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:43:13.930236    4003 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:43:13.933264    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:13.943217    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:14.038507    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:43:14.052829    4003 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.5
	I0831 15:43:14.052841    4003 certs.go:194] generating shared ca certs ...
	I0831 15:43:14.052850    4003 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.053024    4003 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:43:14.053101    4003 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:43:14.053114    4003 certs.go:256] generating profile certs ...
	I0831 15:43:14.053197    4003 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:43:14.053222    4003 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0
	I0831 15:43:14.053237    4003 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0831 15:43:14.128581    4003 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0 ...
	I0831 15:43:14.128599    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0: {Name:mk00e438b52db2444ba8ce93d114dacf50fb7384 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.129258    4003 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0 ...
	I0831 15:43:14.129272    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0: {Name:mkd10daf9fa17e10453b3bbf65f5132bb9bcd577 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.129503    4003 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:43:14.129738    4003 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:43:14.129977    4003 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:43:14.129987    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:43:14.130020    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:43:14.130040    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:43:14.130058    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:43:14.130075    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:43:14.130093    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:43:14.130110    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:43:14.130128    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:43:14.130233    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:43:14.130284    4003 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:43:14.130292    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:43:14.130322    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:43:14.130355    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:43:14.130384    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:43:14.130447    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:14.130483    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.130504    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.130522    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.131005    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:43:14.153234    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:43:14.186923    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:43:14.229049    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:43:14.284589    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0831 15:43:14.334141    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:43:14.385269    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:43:14.429545    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:43:14.461048    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:43:14.494719    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:43:14.523624    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:43:14.557563    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:43:14.571298    4003 ssh_runner.go:195] Run: openssl version
	I0831 15:43:14.575654    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:43:14.584028    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.587453    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.587495    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.591803    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:43:14.600098    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:43:14.608239    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.611660    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.611694    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.615930    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:43:14.624111    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:43:14.632509    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.636012    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.636052    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.640278    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
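The `openssl x509 -hash` / `ln -fs` pairs above install extra CA certificates the way OpenSSL expects to find them: a `<subject-hash>.0` symlink in the hashed certs directory (the log's `b5213941.0` is the subject hash of `minikubeCA.pem`). A minimal self-contained sketch of that technique, using a throwaway self-signed cert and a temp directory rather than the real `/etc/ssl/certs`:

```shell
set -e
# Assumption: illustrative throwaway cert so the sketch is runnable anywhere;
# minikube does the same with /usr/share/ca-certificates/*.pem and /etc/ssl/certs.
CERT=$(mktemp).pem
openssl req -x509 -newkey rsa:2048 -nodes -keyout /dev/null \
  -out "$CERT" -days 1 -subj "/CN=minikubeCA" 2>/dev/null
# OpenSSL locates trusted CAs in a hashed directory via <subject-hash>.0 links.
hash=$(openssl x509 -hash -noout -in "$CERT")
ln -fs "$CERT" "$(dirname "$CERT")/${hash}.0"
ls -l "$(dirname "$CERT")/${hash}.0"
```

The `.0` suffix is a collision counter: a second CA with the same subject hash would be linked as `<hash>.1`.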
	I0831 15:43:14.648758    4003 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:43:14.652057    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:43:14.656418    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:43:14.660743    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:43:14.665063    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:43:14.669321    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:43:14.673568    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
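The run of `openssl x509 ... -checkend 86400` calls above is minikube's cert-expiry probe: `-checkend N` exits non-zero if the certificate expires within N seconds, so a failing check (within 24 hours here) triggers regeneration. A self-contained sketch with a hypothetical short-lived cert standing in for `/var/lib/minikube/certs/*.crt`:

```shell
set -e
# Assumption: throwaway 2-day self-signed cert so the sketch is runnable;
# minikube runs the same check against its real control-plane certs.
CERT=$(mktemp).pem
openssl req -x509 -newkey rsa:2048 -nodes -keyout /dev/null \
  -out "$CERT" -days 2 -subj "/CN=apiserver" 2>/dev/null
# -checkend 86400: succeed only if the cert is still valid 24h from now.
if openssl x509 -noout -in "$CERT" -checkend 86400; then
  echo "valid for at least 24h"
fi
```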
	I0831 15:43:14.677784    4003 kubeadm.go:392] StartCluster: {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 C
lusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:fals
e inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimi
zations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:43:14.677912    4003 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:43:14.690883    4003 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:43:14.698384    4003 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0831 15:43:14.698396    4003 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0831 15:43:14.698441    4003 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0831 15:43:14.706022    4003 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:43:14.706313    4003 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-949000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:14.706401    4003 kubeconfig.go:62] /Users/jenkins/minikube-integration/18943-957/kubeconfig needs updating (will repair): [kubeconfig missing "ha-949000" cluster setting kubeconfig missing "ha-949000" context setting]
	I0831 15:43:14.706628    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.707280    4003 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:14.707484    4003 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, Use
rAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52edc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 15:43:14.707808    4003 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 15:43:14.707985    4003 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0831 15:43:14.715222    4003 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0831 15:43:14.715234    4003 kubeadm.go:597] duration metric: took 16.834195ms to restartPrimaryControlPlane
	I0831 15:43:14.715240    4003 kubeadm.go:394] duration metric: took 37.459181ms to StartCluster
	I0831 15:43:14.715249    4003 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.715327    4003 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:14.715694    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.715917    4003 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:43:14.715930    4003 start.go:241] waiting for startup goroutines ...
	I0831 15:43:14.715938    4003 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 15:43:14.716058    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:14.761177    4003 out.go:177] * Enabled addons: 
	I0831 15:43:14.783218    4003 addons.go:510] duration metric: took 67.285233ms for enable addons: enabled=[]
	I0831 15:43:14.783269    4003 start.go:246] waiting for cluster config update ...
	I0831 15:43:14.783281    4003 start.go:255] writing updated cluster config ...
	I0831 15:43:14.806130    4003 out.go:201] 
	I0831 15:43:14.827581    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:14.827719    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:43:14.850202    4003 out.go:177] * Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	I0831 15:43:14.892085    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:43:14.892153    4003 cache.go:56] Caching tarball of preloaded images
	I0831 15:43:14.892329    4003 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:43:14.892347    4003 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:43:14.892479    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:43:14.893510    4003 start.go:360] acquireMachinesLock for ha-949000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:43:14.893615    4003 start.go:364] duration metric: took 79.031µs to acquireMachinesLock for "ha-949000-m02"
	I0831 15:43:14.893640    4003 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:43:14.893648    4003 fix.go:54] fixHost starting: m02
	I0831 15:43:14.894056    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:43:14.894083    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:43:14.903465    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52073
	I0831 15:43:14.903886    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:43:14.904288    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:43:14.904300    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:43:14.904593    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:43:14.904763    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:14.904931    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:43:14.905038    4003 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:14.905115    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3763
	I0831 15:43:14.906087    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3763 missing from process table
	I0831 15:43:14.906133    4003 fix.go:112] recreateIfNeeded on ha-949000-m02: state=Stopped err=<nil>
	I0831 15:43:14.906161    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	W0831 15:43:14.906324    4003 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:43:14.949174    4003 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m02" ...
	I0831 15:43:14.970157    4003 main.go:141] libmachine: (ha-949000-m02) Calling .Start
	I0831 15:43:14.970435    4003 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:14.970489    4003 main.go:141] libmachine: (ha-949000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid
	I0831 15:43:14.972233    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3763 missing from process table
	I0831 15:43:14.972246    4003 main.go:141] libmachine: (ha-949000-m02) DBG | pid 3763 is in state "Stopped"
	I0831 15:43:14.972295    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid...
	I0831 15:43:14.972683    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Using UUID 23e5d675-5201-4f3d-86b7-b25c818528d1
	I0831 15:43:14.998998    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Generated MAC 92:7:3c:3f:ee:b7
	I0831 15:43:14.999027    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:43:14.999117    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bea80)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:43:14.999143    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bea80)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:43:14.999177    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "23e5d675-5201-4f3d-86b7-b25c818528d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-94
9000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:43:14.999231    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 23e5d675-5201-4f3d-86b7-b25c818528d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 co
nsole=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:43:14.999254    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:43:15.000658    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 DEBUG: hyperkit: Pid is 4035
	I0831 15:43:15.001119    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 0
	I0831 15:43:15.001129    4003 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:15.001211    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 4035
	I0831 15:43:15.003022    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:43:15.003110    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:43:15.003135    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 15:43:15.003157    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39c5e}
	I0831 15:43:15.003193    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 15:43:15.003213    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:43:15.003221    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:43:15.003228    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Found match: 92:7:3c:3f:ee:b7
	I0831 15:43:15.003263    4003 main.go:141] libmachine: (ha-949000-m02) DBG | IP: 192.169.0.6
	I0831 15:43:15.003898    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:15.004131    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:43:15.004587    4003 machine.go:93] provisionDockerMachine start ...
	I0831 15:43:15.004598    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:15.004713    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:15.004819    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:15.004915    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:15.005012    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:15.005089    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:15.005222    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:15.005366    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:15.005373    4003 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:43:15.008900    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:43:15.017748    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:43:15.018656    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:43:15.018679    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:43:15.018711    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:43:15.018731    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:43:15.399794    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:43:15.399810    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:43:15.514263    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:43:15.514282    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:43:15.514290    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:43:15.514296    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:43:15.515095    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:43:15.515105    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:43:21.084857    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:43:21.085024    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:43:21.085033    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:43:21.108855    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:43:50.068778    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:43:50.068792    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:43:50.068914    4003 buildroot.go:166] provisioning hostname "ha-949000-m02"
	I0831 15:43:50.068926    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:43:50.069013    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.069099    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.069176    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.069263    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.069336    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.069474    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.069630    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.069640    4003 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m02 && echo "ha-949000-m02" | sudo tee /etc/hostname
	I0831 15:43:50.130987    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m02
	
	I0831 15:43:50.131001    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.131142    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.131239    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.131330    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.131429    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.131565    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.131704    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.131716    4003 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:43:50.189171    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 
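For reference, the `/etc/hosts` reconciliation the provisioner just ran above can be reproduced in isolation. This is a minimal sketch against a scratch file (the real run edits `/etc/hosts` over SSH with `sudo`; the file contents here are illustrative):

```shell
# Reconcile the 127.0.1.1 hostname record, as the SSH command above does:
# rewrite an existing 127.0.1.1 line if present, otherwise append one.
HOSTS=$(mktemp)
printf '127.0.0.1 localhost\n127.0.1.1 old-name\n' > "$HOSTS"
NEW=ha-949000-m02
if ! grep -q "[[:space:]]${NEW}\$" "$HOSTS"; then
  if grep -q '^127\.0\.1\.1[[:space:]]' "$HOSTS"; then
    # an entry exists -> rewrite it in place
    sed -i "s/^127\.0\.1\.1[[:space:]].*/127.0.1.1 ${NEW}/" "$HOSTS"
  else
    # no entry -> append one
    echo "127.0.1.1 ${NEW}" >> "$HOSTS"
  fi
fi
grep '^127.0.1.1' "$HOSTS"
rm -f "$HOSTS"
```

Because the outer `grep` guard checks for the new name first, re-running the command is a no-op, which is why the provisioner can safely execute it on every start.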
	I0831 15:43:50.189186    4003 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:43:50.189202    4003 buildroot.go:174] setting up certificates
	I0831 15:43:50.189208    4003 provision.go:84] configureAuth start
	I0831 15:43:50.189215    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:43:50.189354    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:50.189440    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.189529    4003 provision.go:143] copyHostCerts
	I0831 15:43:50.189563    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:50.189610    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:43:50.189616    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:50.189739    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:43:50.189940    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:50.189969    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:43:50.189973    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:50.190084    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:43:50.190251    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:50.190286    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:43:50.190291    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:50.190364    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:43:50.190554    4003 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m02 san=[127.0.0.1 192.169.0.6 ha-949000-m02 localhost minikube]
	I0831 15:43:50.447994    4003 provision.go:177] copyRemoteCerts
	I0831 15:43:50.448048    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:43:50.448062    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.448197    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.448289    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.448376    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.448469    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:50.481386    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:43:50.481457    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:43:50.500479    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:43:50.500539    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:43:50.519580    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:43:50.519638    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:43:50.538582    4003 provision.go:87] duration metric: took 349.361412ms to configureAuth
	I0831 15:43:50.538594    4003 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:43:50.538767    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:50.538781    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:50.538915    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.539010    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.539090    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.539170    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.539253    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.539350    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.539469    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.539477    4003 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:43:50.589461    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:43:50.589472    4003 buildroot.go:70] root file system type: tmpfs
	I0831 15:43:50.589565    4003 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:43:50.589575    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.589709    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.589808    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.589904    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.589997    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.590114    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.590247    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.590295    4003 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:43:50.650656    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:43:50.650675    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.650817    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.650904    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.650975    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.651066    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.651189    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.651328    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.651340    4003 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:43:52.319769    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:43:52.319783    4003 machine.go:96] duration metric: took 37.314787706s to provisionDockerMachine
	I0831 15:43:52.319791    4003 start.go:293] postStartSetup for "ha-949000-m02" (driver="hyperkit")
	I0831 15:43:52.319799    4003 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:43:52.319809    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.319999    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:43:52.320012    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.320113    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.320206    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.320293    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.320379    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:52.352031    4003 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:43:52.355233    4003 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:43:52.355244    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:43:52.355330    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:43:52.355466    4003 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:43:52.355473    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:43:52.355627    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:43:52.362886    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:52.382898    4003 start.go:296] duration metric: took 63.098255ms for postStartSetup
	I0831 15:43:52.382918    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.383098    4003 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:43:52.383110    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.383181    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.383271    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.383354    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.383436    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:52.415810    4003 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:43:52.415864    4003 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:43:52.449230    4003 fix.go:56] duration metric: took 37.555176154s for fixHost
	I0831 15:43:52.449254    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.449385    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.449479    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.449570    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.449656    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.449784    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:52.449933    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:52.449941    4003 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:43:52.500604    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725144232.566642995
	
	I0831 15:43:52.500618    4003 fix.go:216] guest clock: 1725144232.566642995
	I0831 15:43:52.500629    4003 fix.go:229] Guest: 2024-08-31 15:43:52.566642995 -0700 PDT Remote: 2024-08-31 15:43:52.449243 -0700 PDT m=+56.586086649 (delta=117.399995ms)
	I0831 15:43:52.500641    4003 fix.go:200] guest clock delta is within tolerance: 117.399995ms
	I0831 15:43:52.500644    4003 start.go:83] releasing machines lock for "ha-949000-m02", held for 37.60661602s
	I0831 15:43:52.500661    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.500790    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:52.524083    4003 out.go:177] * Found network options:
	I0831 15:43:52.545377    4003 out.go:177]   - NO_PROXY=192.169.0.5
	W0831 15:43:52.567312    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:43:52.567341    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.567964    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.568161    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.568240    4003 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:43:52.568275    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	W0831 15:43:52.568384    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:43:52.568419    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.568477    4003 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:43:52.568494    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.568580    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.568637    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.568715    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.568763    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.568895    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.568930    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:52.569064    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	W0831 15:43:52.598238    4003 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:43:52.598301    4003 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:43:52.641479    4003 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:43:52.641502    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:52.641620    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:52.657762    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:43:52.666682    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:43:52.675584    4003 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:43:52.675632    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:43:52.684590    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:52.693450    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:43:52.702203    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:52.711110    4003 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:43:52.720178    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:43:52.729030    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:43:52.738456    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
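The containerd configuration above is done entirely with in-place `sed` rewrites of `/etc/containerd/config.toml`. The key one is forcing the cgroupfs driver; a sketch against a scratch config fragment (the TOML snippet is illustrative):

```shell
# Flip SystemdCgroup to false while preserving the line's indentation,
# exactly as the sed command in the log does.
cfg=$(mktemp)
printf '[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]\n  SystemdCgroup = true\n' > "$cfg"
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
grep 'SystemdCgroup' "$cfg"
rm -f "$cfg"
```

The captured `( *)` group is what keeps the TOML indentation intact, so the edit is safe regardless of how deeply the key is nested.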
	I0831 15:43:52.748149    4003 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:43:52.756790    4003 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:43:52.765391    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:52.862859    4003 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:43:52.883299    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:52.883366    4003 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:43:52.900841    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:52.911689    4003 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:43:52.925373    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:52.936790    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:52.947768    4003 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:43:52.969807    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:52.980241    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:52.995125    4003 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:43:52.998026    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:43:53.005290    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:43:53.018832    4003 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:43:53.124064    4003 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:43:53.226798    4003 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:43:53.226820    4003 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:43:53.241337    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:53.342509    4003 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:43:55.695532    4003 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.352978813s)
	I0831 15:43:55.695593    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:43:55.706164    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:55.716443    4003 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:43:55.813069    4003 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:43:55.914225    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:56.017829    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:43:56.031977    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:56.043082    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:56.147482    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:43:56.211631    4003 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:43:56.211708    4003 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:43:56.216202    4003 start.go:563] Will wait 60s for crictl version
	I0831 15:43:56.216251    4003 ssh_runner.go:195] Run: which crictl
	I0831 15:43:56.223176    4003 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:43:56.247497    4003 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:43:56.247568    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:56.264978    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:56.322638    4003 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:43:56.344590    4003 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:43:56.365748    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:56.366152    4003 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:43:56.370681    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:56.380351    4003 mustload.go:65] Loading cluster: ha-949000
	I0831 15:43:56.380517    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:56.380743    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:43:56.380758    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:43:56.389551    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52095
	I0831 15:43:56.390006    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:43:56.390330    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:43:56.390341    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:43:56.390567    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:43:56.390683    4003 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:43:56.390760    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:56.390827    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 4017
	I0831 15:43:56.391784    4003 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:43:56.392030    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:43:56.392047    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:43:56.400646    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52097
	I0831 15:43:56.401071    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:43:56.401432    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:43:56.401449    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:43:56.401654    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:43:56.401763    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:56.401863    4003 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.6
	I0831 15:43:56.401868    4003 certs.go:194] generating shared ca certs ...
	I0831 15:43:56.401876    4003 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:56.402014    4003 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:43:56.402069    4003 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:43:56.402077    4003 certs.go:256] generating profile certs ...
	I0831 15:43:56.402165    4003 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:43:56.402256    4003 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952
	I0831 15:43:56.402304    4003 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:43:56.402311    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:43:56.402331    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:43:56.402351    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:43:56.402368    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:43:56.402387    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:43:56.402405    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:43:56.402427    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:43:56.402445    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:43:56.402522    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:43:56.402560    4003 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:43:56.402572    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:43:56.402605    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:43:56.402639    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:43:56.402671    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:43:56.402737    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:56.402769    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.402811    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.402831    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.402857    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:56.402948    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:56.403031    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:56.403124    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:56.403213    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:56.428694    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:43:56.431875    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:43:56.440490    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:43:56.443670    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:43:56.452165    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:43:56.455207    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:43:56.463624    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:43:56.466671    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:43:56.475535    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:43:56.478615    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:43:56.487110    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:43:56.490238    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:43:56.498895    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:43:56.519238    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:43:56.539011    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:43:56.558598    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:43:56.578234    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0831 15:43:56.597888    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:43:56.617519    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:43:56.637284    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:43:56.657084    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:43:56.676448    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:43:56.696310    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:43:56.715741    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:43:56.729513    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:43:56.743001    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:43:56.756453    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:43:56.770115    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:43:56.784073    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:43:56.797658    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:43:56.810908    4003 ssh_runner.go:195] Run: openssl version
	I0831 15:43:56.815001    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:43:56.823241    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.826641    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.826682    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.830949    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:43:56.839331    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:43:56.847777    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.851154    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.851190    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.855448    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:43:56.863829    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:43:56.872178    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.875731    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.875765    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.879995    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:43:56.888471    4003 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:43:56.892039    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:43:56.896510    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:43:56.900794    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:43:56.904975    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:43:56.909175    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:43:56.913367    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 15:43:56.917519    4003 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.31.0 docker true true} ...
	I0831 15:43:56.917575    4003 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:43:56.917596    4003 kube-vip.go:115] generating kube-vip config ...
	I0831 15:43:56.917626    4003 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:43:56.929983    4003 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:43:56.930030    4003 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:43:56.930087    4003 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:43:56.938650    4003 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:43:56.938693    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:43:56.948188    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:43:56.962082    4003 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:43:56.975374    4003 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:43:56.989089    4003 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:43:56.991924    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:57.001250    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:57.094190    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:43:57.108747    4003 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:43:57.108933    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:57.130218    4003 out.go:177] * Verifying Kubernetes components...
	I0831 15:43:57.171663    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:57.293447    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:43:57.304999    4003 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:57.305203    4003 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52edc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:43:57.305240    4003 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:43:57.305411    4003 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:43:57.305492    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:43:57.305497    4003 round_trippers.go:469] Request Headers:
	I0831 15:43:57.305505    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:43:57.305514    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.959710    4003 round_trippers.go:574] Response Status: 200 OK in 8654 milliseconds
	I0831 15:44:05.960438    4003 node_ready.go:49] node "ha-949000-m02" has status "Ready":"True"
	I0831 15:44:05.960449    4003 node_ready.go:38] duration metric: took 8.6549293s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:44:05.960456    4003 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:44:05.960491    4003 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 15:44:05.960499    4003 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 15:44:05.960533    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:05.960537    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.960545    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.960552    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.970871    4003 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0831 15:44:05.978345    4003 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:05.978408    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:44:05.978422    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.978429    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.978433    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.985369    4003 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:44:05.985815    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:05.985824    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.985830    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.985833    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.991184    4003 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:44:05.991513    4003 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:05.991523    4003 pod_ready.go:82] duration metric: took 13.160164ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:05.991530    4003 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:05.991572    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:44:05.991577    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.991582    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.991587    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.000332    4003 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0831 15:44:06.000855    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.000863    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.000872    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.000878    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.013265    4003 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0831 15:44:06.013530    4003 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.013539    4003 pod_ready.go:82] duration metric: took 22.004461ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.013546    4003 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.013590    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:44:06.013595    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.013601    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.013603    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.020268    4003 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:44:06.020643    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.020651    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.020657    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.020661    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.027711    4003 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:44:06.028254    4003 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.028264    4003 pod_ready.go:82] duration metric: took 14.711969ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.028272    4003 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.028311    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:44:06.028316    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.028322    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.028326    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.039178    4003 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0831 15:44:06.039603    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:06.039612    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.039618    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.039621    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.041381    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:06.041651    4003 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.041661    4003 pod_ready.go:82] duration metric: took 13.384756ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.041667    4003 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.041704    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:44:06.041709    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.041715    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.041718    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.043280    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:06.161143    4003 request.go:632] Waited for 117.478694ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:06.161211    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:06.161216    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.161222    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.161225    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.165879    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:44:06.166023    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "etcd-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:06.166034    4003 pod_ready.go:82] duration metric: took 124.360492ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:06.166042    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "etcd-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:06.166052    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.361793    4003 request.go:632] Waited for 195.664438ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:44:06.361828    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:44:06.361833    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.361839    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.361847    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.363761    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:06.561193    4003 request.go:632] Waited for 196.830957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.561252    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.561266    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.561279    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.561292    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.564567    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:06.565116    4003 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.565128    4003 pod_ready.go:82] duration metric: took 399.063144ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.565137    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.761258    4003 request.go:632] Waited for 195.975667ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:44:06.761325    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:44:06.761334    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.761351    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.761363    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.764874    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:06.960633    4003 request.go:632] Waited for 195.219559ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:06.960666    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:06.960695    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.960702    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.960706    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.966407    4003 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:44:06.966698    4003 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.966707    4003 pod_ready.go:82] duration metric: took 401.560896ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.966714    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.161478    4003 request.go:632] Waited for 194.704872ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:44:07.161625    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:44:07.161636    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.161647    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.161656    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.165538    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:07.361967    4003 request.go:632] Waited for 195.95763ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:07.362001    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:07.362006    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.362012    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.362016    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.363942    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:44:07.364015    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:07.364027    4003 pod_ready.go:82] duration metric: took 397.303245ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:07.364034    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:07.364047    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.561375    4003 request.go:632] Waited for 197.282382ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:44:07.561418    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:44:07.561424    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.561430    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.561434    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.563374    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:07.761585    4003 request.go:632] Waited for 197.505917ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:07.761680    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:07.761692    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.761703    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.761710    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.765076    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:07.765411    4003 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:07.765423    4003 pod_ready.go:82] duration metric: took 401.363562ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.765432    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.961150    4003 request.go:632] Waited for 195.676394ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:44:07.961210    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:44:07.961216    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.961223    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.961232    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.963936    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.160774    4003 request.go:632] Waited for 196.46087ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.160847    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.160855    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.160863    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.160885    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.163147    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.163737    4003 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:08.163748    4003 pod_ready.go:82] duration metric: took 398.305248ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:08.163756    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:08.360946    4003 request.go:632] Waited for 197.148459ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:44:08.361013    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:44:08.361018    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.361025    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.361030    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.363306    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.561349    4003 request.go:632] Waited for 197.594231ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:08.561505    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:08.561518    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.561529    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.561536    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.564572    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:44:08.564661    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:08.564674    4003 pod_ready.go:82] duration metric: took 400.906717ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:08.564683    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:08.564694    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:08.760636    4003 request.go:632] Waited for 195.893531ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:08.760715    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:08.760720    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.760726    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.760729    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.763646    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.961865    4003 request.go:632] Waited for 197.701917ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.961922    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.961933    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.961945    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.961952    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.964688    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:09.160991    4003 request.go:632] Waited for 95.682906ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:09.161056    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:09.161066    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.161078    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.161088    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.164212    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:09.360988    4003 request.go:632] Waited for 196.217621ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.361022    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.361027    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.361055    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.361059    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.363713    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:09.564888    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:09.564900    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.564907    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.564913    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.568623    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:09.760895    4003 request.go:632] Waited for 191.666981ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.760944    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.760952    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.760958    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.760962    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.763257    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:10.065958    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:10.065982    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.065993    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.065998    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.069180    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:10.162666    4003 request.go:632] Waited for 93.035977ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:10.162750    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:10.162767    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.162780    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.162786    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.165653    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:10.565356    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:10.565380    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.565391    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.565397    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.568883    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:10.569642    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:10.569650    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.569655    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.569658    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.571069    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:10.571366    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:11.066968    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:11.066994    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.067006    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.067011    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.070763    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:11.071322    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:11.071330    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.071335    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.071339    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.072824    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:11.565282    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:11.565303    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.565314    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.565320    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.568672    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:11.569364    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:11.569371    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.569378    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.569381    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.571110    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:12.065991    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:12.066013    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.066025    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.066038    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.070105    4003 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:44:12.070531    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:12.070540    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.070548    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.070553    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.072400    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:12.566716    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:12.566745    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.566756    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.566762    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.570548    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:12.570980    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:12.570991    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.571000    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.571005    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.573075    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:12.573392    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:13.065503    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:13.065529    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.065540    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.065545    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.069028    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:13.069606    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:13.069616    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.069624    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.069628    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.071291    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:13.566706    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:13.566719    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.566724    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.566727    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.568695    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:13.569316    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:13.569324    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.569330    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.569340    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.570910    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:14.066070    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:14.066097    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.066110    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.066122    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.069846    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:14.070280    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:14.070288    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.070294    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.070298    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.071983    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:14.565072    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:14.565092    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.565103    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.565121    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.568991    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:14.569470    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:14.569478    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.569486    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.569489    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.571194    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:15.065570    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:15.065590    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.065602    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.065608    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.069259    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:15.069742    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:15.069750    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.069756    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.069761    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.071256    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:15.071608    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:15.565664    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:15.565722    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.565736    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.565743    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.568446    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:15.568953    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:15.568960    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.568966    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.568969    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.570393    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:16.066647    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:16.066673    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.066683    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.066689    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.069968    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:16.070667    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:16.070678    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.070686    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.070700    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.072421    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:16.565080    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:16.565093    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.565100    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.565105    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.567016    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:16.567805    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:16.567814    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.567819    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.567829    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.569508    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:17.065211    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:17.065233    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.065243    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.065249    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.068848    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:17.069431    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:17.069442    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.069451    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.069454    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.071237    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:17.565694    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:17.565715    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.565726    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.565732    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.569041    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:17.569625    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:17.569632    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.569638    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.569648    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.571537    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:17.572005    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:18.065338    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:18.065353    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.065361    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.065365    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.067574    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:18.067956    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:18.067963    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.067969    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.067973    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.069437    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:18.565941    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:18.565963    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.565974    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.565984    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.569115    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:18.569832    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:18.569842    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.569850    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.569854    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.571574    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:19.065517    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:19.065533    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.065540    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.065545    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.068125    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:19.068655    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:19.068662    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.068667    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.068672    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.070197    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:19.566293    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:19.566372    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.566385    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.566395    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.569750    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:19.570211    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:19.570219    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.570224    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.570229    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.571922    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:19.572415    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:20.065051    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:20.065066    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.065073    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.065078    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.070133    4003 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:44:20.070557    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:20.070565    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.070570    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.070573    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.072277    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:20.566009    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:20.566031    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.566042    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.566051    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.570001    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:20.570447    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:20.570453    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.570458    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.570460    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.572199    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:21.065187    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:21.065210    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.065222    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.065227    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.067898    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:21.068345    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:21.068353    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.068358    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.068362    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.069742    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:21.565705    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:21.565724    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.565735    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.565741    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.568938    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:21.569630    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:21.569641    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.569647    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.569655    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.571194    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:22.065027    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:22.065051    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.065062    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.065100    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.069375    4003 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:44:22.069729    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:22.069737    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.069743    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.069747    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.072208    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:22.072476    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:22.565894    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:22.565917    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.565928    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.565937    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.569490    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:22.569886    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:22.569893    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.569899    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.569903    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.571462    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:23.066179    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:23.066201    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.066212    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.066219    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.070218    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:23.070845    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:23.070855    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.070862    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.070867    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.072481    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:23.565085    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:23.565099    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.565105    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.565109    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.567899    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:23.568397    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:23.568405    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.568411    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.568431    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.571121    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:24.065227    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:24.065249    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.065261    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.065270    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.068196    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:24.068774    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:24.068782    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.068787    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.068791    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.070317    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:24.565319    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:24.565332    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.565337    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.565340    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.567104    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:24.567586    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:24.567594    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.567600    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.567603    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.569279    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:24.569664    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:25.066218    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:25.066244    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.066255    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.066260    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.069969    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.070824    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:25.070848    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.070854    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.070863    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.072406    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.072709    4003 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.072718    4003 pod_ready.go:82] duration metric: took 16.507839534s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.072727    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.072755    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:44:25.072760    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.072765    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.072769    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.074170    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.074584    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:25.074591    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.074596    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.074599    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.076066    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:44:25.076162    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-proxy-d45q5" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.076170    4003 pod_ready.go:82] duration metric: took 3.437579ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:25.076175    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-proxy-d45q5" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.076179    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.076206    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:44:25.076211    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.076216    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.076219    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.077746    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.078120    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:25.078127    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.078133    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.078136    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.079498    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.079894    4003 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.079903    4003 pod_ready.go:82] duration metric: took 3.717598ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.079909    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.079936    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:44:25.079941    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.079946    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.079951    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.081600    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.081932    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:25.081940    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.081946    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.081949    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.083262    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.083552    4003 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.083561    4003 pod_ready.go:82] duration metric: took 3.647661ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.083567    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.083594    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:44:25.083603    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.083609    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.083614    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.085111    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.085438    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:25.085446    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.085452    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.085455    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.087068    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.087348    4003 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.087357    4003 pod_ready.go:82] duration metric: took 3.784951ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.087363    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.267294    4003 request.go:632] Waited for 179.857802ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:44:25.267367    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:44:25.267377    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.267389    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.267395    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.271053    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.466554    4003 request.go:632] Waited for 195.015611ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:25.466691    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:25.466701    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.466712    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.466721    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.470050    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:44:25.470127    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.470148    4003 pod_ready.go:82] duration metric: took 382.775358ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:25.470158    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.470165    4003 pod_ready.go:39] duration metric: took 19.509491295s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
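	The polling loop logged above (pod_ready.go) repeatedly GETs each kube-system pod and then its hosting node, and only advances once the pod reports a Ready=True condition. A minimal self-contained sketch of that condition check, using only the stdlib JSON decoder (the struct here mirrors just the Pod API fields needed and is illustrative, not minikube's actual types):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// podStatus mirrors only the fields of a Kubernetes Pod that the
// readiness check needs; JSON field names follow the Pod API schema.
type podStatus struct {
	Status struct {
		Conditions []struct {
			Type   string `json:"type"`
			Status string `json:"status"`
		} `json:"conditions"`
	} `json:"status"`
}

// isReady reports whether the pod JSON carries a Ready=True condition,
// which is the predicate the logged loop waits on before moving to the
// next pod.
func isReady(podJSON []byte) (bool, error) {
	var p podStatus
	if err := json.Unmarshal(podJSON, &p); err != nil {
		return false, err
	}
	for _, c := range p.Status.Conditions {
		if c.Type == "Ready" {
			return c.Status == "True", nil
		}
	}
	return false, nil
}

func main() {
	notReady := []byte(`{"status":{"conditions":[{"type":"Ready","status":"False"}]}}`)
	ready := []byte(`{"status":{"conditions":[{"type":"Ready","status":"True"}]}}`)
	a, _ := isReady(notReady)
	b, _ := isReady(ready)
	fmt.Println(a, b) // false true
}
```

	The paired node GET in the log serves a second predicate: even a Ready pod is skipped (pod_ready.go:98 above) when its hosting node is missing or not Ready, which is exactly what happens for the deleted ha-949000-m03.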
	I0831 15:44:25.470190    4003 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:44:25.470257    4003 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:44:25.483780    4003 api_server.go:72] duration metric: took 28.374703678s to wait for apiserver process to appear ...
	I0831 15:44:25.483792    4003 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:44:25.483807    4003 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:44:25.486833    4003 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:44:25.486870    4003 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:44:25.486875    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.486882    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.486887    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.487354    4003 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:44:25.487409    4003 api_server.go:141] control plane version: v1.31.0
	I0831 15:44:25.487417    4003 api_server.go:131] duration metric: took 3.620759ms to wait for apiserver health ...
	I0831 15:44:25.487424    4003 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:44:25.666509    4003 request.go:632] Waited for 179.03877ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:25.666550    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:25.666557    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.666565    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.666601    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.670513    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.675202    4003 system_pods.go:59] 24 kube-system pods found
	I0831 15:44:25.675220    4003 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:44:25.675225    4003 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:44:25.675229    4003 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:44:25.675232    4003 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:44:25.675236    4003 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:44:25.675238    4003 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:44:25.675241    4003 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:44:25.675244    4003 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:44:25.675247    4003 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:44:25.675249    4003 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:44:25.675252    4003 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:44:25.675255    4003 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:44:25.675258    4003 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:44:25.675261    4003 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:44:25.675263    4003 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:44:25.675266    4003 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:44:25.675268    4003 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:44:25.675271    4003 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:44:25.675274    4003 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:44:25.675280    4003 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:44:25.675283    4003 system_pods.go:61] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:44:25.675286    4003 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:44:25.675288    4003 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:44:25.675292    4003 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:44:25.675296    4003 system_pods.go:74] duration metric: took 187.866388ms to wait for pod list to return data ...
	I0831 15:44:25.675301    4003 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:44:25.866631    4003 request.go:632] Waited for 191.264353ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:44:25.866761    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:44:25.866771    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.866783    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.866789    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.870307    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.870649    4003 default_sa.go:45] found service account: "default"
	I0831 15:44:25.870663    4003 default_sa.go:55] duration metric: took 195.354455ms for default service account to be created ...
	I0831 15:44:25.870670    4003 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:44:26.067233    4003 request.go:632] Waited for 196.47603ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:26.067280    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:26.067290    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:26.067301    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:26.067307    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:26.072107    4003 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:44:26.077415    4003 system_pods.go:86] 24 kube-system pods found
	I0831 15:44:26.077426    4003 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:44:26.077431    4003 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:44:26.077435    4003 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:44:26.077439    4003 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:44:26.077442    4003 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:44:26.077446    4003 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:44:26.077448    4003 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:44:26.077451    4003 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:44:26.077454    4003 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:44:26.077459    4003 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:44:26.077462    4003 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:44:26.077467    4003 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:44:26.077470    4003 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:44:26.077473    4003 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:44:26.077477    4003 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:44:26.077479    4003 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:44:26.077482    4003 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:44:26.077485    4003 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:44:26.077488    4003 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:44:26.077491    4003 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:44:26.077494    4003 system_pods.go:89] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:44:26.077497    4003 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:44:26.077499    4003 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:44:26.077502    4003 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:44:26.077506    4003 system_pods.go:126] duration metric: took 206.829ms to wait for k8s-apps to be running ...
	I0831 15:44:26.077512    4003 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:44:26.077564    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:44:26.088970    4003 system_svc.go:56] duration metric: took 11.450852ms WaitForService to wait for kubelet
	I0831 15:44:26.088985    4003 kubeadm.go:582] duration metric: took 28.979903586s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:44:26.088998    4003 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:44:26.266791    4003 request.go:632] Waited for 177.710266ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:44:26.266867    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:44:26.266875    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:26.266886    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:26.266896    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:26.270407    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:26.271146    4003 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:44:26.271167    4003 node_conditions.go:123] node cpu capacity is 2
	I0831 15:44:26.271180    4003 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:44:26.271188    4003 node_conditions.go:123] node cpu capacity is 2
	I0831 15:44:26.271193    4003 node_conditions.go:105] duration metric: took 182.189243ms to run NodePressure ...
	I0831 15:44:26.271203    4003 start.go:241] waiting for startup goroutines ...
	I0831 15:44:26.271229    4003 start.go:255] writing updated cluster config ...
	I0831 15:44:26.293325    4003 out.go:201] 
	I0831 15:44:26.315324    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:44:26.315453    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:44:26.337988    4003 out.go:177] * Starting "ha-949000-m04" worker node in "ha-949000" cluster
	I0831 15:44:26.380685    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:44:26.380719    4003 cache.go:56] Caching tarball of preloaded images
	I0831 15:44:26.380921    4003 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:44:26.380941    4003 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:44:26.381080    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:44:26.382207    4003 start.go:360] acquireMachinesLock for ha-949000-m04: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:44:26.382290    4003 start.go:364] duration metric: took 66.399µs to acquireMachinesLock for "ha-949000-m04"
	I0831 15:44:26.382307    4003 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:44:26.382314    4003 fix.go:54] fixHost starting: m04
	I0831 15:44:26.382612    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:44:26.382638    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:44:26.391652    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52102
	I0831 15:44:26.391986    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:44:26.392342    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:44:26.392365    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:44:26.392613    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:44:26.392733    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:44:26.392824    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:44:26.392912    4003 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:44:26.392996    4003 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3806
	I0831 15:44:26.393933    4003 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid 3806 missing from process table
	I0831 15:44:26.393956    4003 fix.go:112] recreateIfNeeded on ha-949000-m04: state=Stopped err=<nil>
	I0831 15:44:26.393965    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	W0831 15:44:26.394099    4003 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:44:26.414853    4003 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m04" ...
	I0831 15:44:26.456728    4003 main.go:141] libmachine: (ha-949000-m04) Calling .Start
	I0831 15:44:26.457073    4003 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:44:26.457142    4003 main.go:141] libmachine: (ha-949000-m04) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid
	I0831 15:44:26.457233    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Using UUID 5ee34770-2239-4427-9789-bd204fe095a6
	I0831 15:44:26.482643    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Generated MAC 8a:3c:61:5f:c5:84
	I0831 15:44:26.482668    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:44:26.482825    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001201e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:44:26.482873    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001201e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:44:26.482921    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "5ee34770-2239-4427-9789-bd204fe095a6", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:44:26.482962    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 5ee34770-2239-4427-9789-bd204fe095a6 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:44:26.482975    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:44:26.484373    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Pid is 4071
	I0831 15:44:26.484859    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 0
	I0831 15:44:26.484876    4003 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:44:26.484959    4003 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 4071
	I0831 15:44:26.487135    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:44:26.487196    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:44:26.487221    4003 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 15:44:26.487236    4003 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 15:44:26.487249    4003 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39c5e}
	I0831 15:44:26.487264    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Found match: 8a:3c:61:5f:c5:84
	I0831 15:44:26.487276    4003 main.go:141] libmachine: (ha-949000-m04) DBG | IP: 192.169.0.8
	I0831 15:44:26.487302    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetConfigRaw
	I0831 15:44:26.488058    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:44:26.488267    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:44:26.488733    4003 machine.go:93] provisionDockerMachine start ...
	I0831 15:44:26.488743    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:44:26.488866    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:44:26.488967    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:44:26.489052    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:44:26.489152    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:44:26.489235    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:44:26.489342    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:44:26.489512    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:44:26.489524    4003 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:44:26.492093    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:44:26.500227    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:44:26.501190    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:44:26.501211    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:44:26.501222    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:44:26.501234    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:44:26.887163    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:44:26.887179    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:44:27.001897    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:44:27.001917    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:44:27.001935    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:44:27.001949    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:44:27.002783    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:44:27.002794    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:44:32.603005    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:44:32.603055    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:44:32.603066    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:44:32.626242    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:45:01.551772    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:45:01.551791    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:45:01.551924    4003 buildroot.go:166] provisioning hostname "ha-949000-m04"
	I0831 15:45:01.551935    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:45:01.552030    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.552119    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.552201    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.552291    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.552372    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.552497    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.552634    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.552642    4003 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m04 && echo "ha-949000-m04" | sudo tee /etc/hostname
	I0831 15:45:01.616885    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m04
	
	I0831 15:45:01.616906    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.617041    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.617145    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.617232    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.617317    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.617452    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.617606    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.617618    4003 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:45:01.675471    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:45:01.675486    4003 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:45:01.675499    4003 buildroot.go:174] setting up certificates
	I0831 15:45:01.675505    4003 provision.go:84] configureAuth start
	I0831 15:45:01.675512    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:45:01.675643    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:45:01.675763    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.675858    4003 provision.go:143] copyHostCerts
	I0831 15:45:01.675886    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:45:01.675959    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:45:01.675965    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:45:01.676118    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:45:01.676365    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:45:01.676407    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:45:01.676412    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:45:01.676500    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:45:01.676663    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:45:01.676709    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:45:01.676714    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:45:01.676793    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:45:01.676940    4003 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m04 san=[127.0.0.1 192.169.0.8 ha-949000-m04 localhost minikube]
	I0831 15:45:01.762314    4003 provision.go:177] copyRemoteCerts
	I0831 15:45:01.762367    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:45:01.762382    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.762557    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.762656    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.762756    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.762844    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:01.796205    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:45:01.796279    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:45:01.815211    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:45:01.815279    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:45:01.834188    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:45:01.834257    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:45:01.853640    4003 provision.go:87] duration metric: took 178.124085ms to configureAuth
	I0831 15:45:01.853653    4003 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:45:01.853819    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:45:01.853832    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:01.853954    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.854036    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.854122    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.854210    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.854294    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.854407    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.854531    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.854538    4003 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:45:01.906456    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:45:01.906469    4003 buildroot.go:70] root file system type: tmpfs
	I0831 15:45:01.906548    4003 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:45:01.906561    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.906689    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.906786    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.906885    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.906960    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.907078    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.907226    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.907270    4003 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:45:01.970284    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:45:01.970303    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.970453    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.970548    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.970632    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.970725    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.970876    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.971019    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.971040    4003 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:45:03.516394    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:45:03.516410    4003 machine.go:96] duration metric: took 37.027272003s to provisionDockerMachine
	I0831 15:45:03.516419    4003 start.go:293] postStartSetup for "ha-949000-m04" (driver="hyperkit")
	I0831 15:45:03.516426    4003 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:45:03.516446    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.516635    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:45:03.516649    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.516745    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.516831    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.516911    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.517003    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:03.549510    4003 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:45:03.552575    4003 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:45:03.552586    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:45:03.552685    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:45:03.552861    4003 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:45:03.552868    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:45:03.553075    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:45:03.560251    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:45:03.579932    4003 start.go:296] duration metric: took 63.505056ms for postStartSetup
	I0831 15:45:03.579953    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.580123    4003 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:45:03.580137    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.580227    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.580304    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.580383    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.580463    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:03.613355    4003 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:45:03.613415    4003 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:45:03.667593    4003 fix.go:56] duration metric: took 37.284874453s for fixHost
	I0831 15:45:03.667632    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.667887    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.668092    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.668253    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.668442    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.668679    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:03.668942    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:03.668957    4003 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:45:03.721925    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725144303.791568584
	
	I0831 15:45:03.721940    4003 fix.go:216] guest clock: 1725144303.791568584
	I0831 15:45:03.721945    4003 fix.go:229] Guest: 2024-08-31 15:45:03.791568584 -0700 PDT Remote: 2024-08-31 15:45:03.667616 -0700 PDT m=+127.803695939 (delta=123.952584ms)
	I0831 15:45:03.721980    4003 fix.go:200] guest clock delta is within tolerance: 123.952584ms
	I0831 15:45:03.721984    4003 start.go:83] releasing machines lock for "ha-949000-m04", held for 37.339285395s
	I0831 15:45:03.722007    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.722145    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:45:03.745774    4003 out.go:177] * Found network options:
	I0831 15:45:03.767373    4003 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0831 15:45:03.788896    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:45:03.788955    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:45:03.788975    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.789814    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.790060    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.790166    4003 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:45:03.790203    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	W0831 15:45:03.790303    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:45:03.790355    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:45:03.790430    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.790462    4003 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:45:03.790479    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.790581    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.790645    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.790761    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.790846    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.790934    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:03.791028    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.791215    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	W0831 15:45:03.820995    4003 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:45:03.821055    4003 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:45:03.865115    4003 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:45:03.865138    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:45:03.865245    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:45:03.881224    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:45:03.890437    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:45:03.899610    4003 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:45:03.899666    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:45:03.908938    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:45:03.918184    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:45:03.927312    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:45:03.936702    4003 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:45:03.946157    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:45:03.955222    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:45:03.964152    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:45:03.973257    4003 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:45:03.981558    4003 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:45:03.989901    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:04.086014    4003 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:45:04.105538    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:45:04.105610    4003 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:45:04.121430    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:45:04.134788    4003 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:45:04.151049    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:45:04.161844    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:45:04.172949    4003 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:45:04.191373    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:45:04.201771    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:45:04.216770    4003 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:45:04.219760    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:45:04.226792    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:45:04.240592    4003 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:45:04.340799    4003 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:45:04.439649    4003 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:45:04.439671    4003 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:45:04.453918    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:04.542337    4003 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:45:06.812888    4003 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.270508765s)
	I0831 15:45:06.812949    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:45:06.823181    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:45:06.833531    4003 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:45:06.936150    4003 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:45:07.044179    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:07.137898    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:45:07.152258    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:45:07.163263    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:07.258016    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:45:07.318759    4003 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:45:07.318841    4003 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:45:07.323364    4003 start.go:563] Will wait 60s for crictl version
	I0831 15:45:07.323422    4003 ssh_runner.go:195] Run: which crictl
	I0831 15:45:07.326572    4003 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:45:07.358444    4003 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:45:07.358520    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:45:07.376088    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:45:07.414824    4003 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:45:07.456544    4003 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:45:07.477408    4003 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:45:07.498401    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:45:07.498760    4003 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:45:07.503179    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:45:07.513368    4003 mustload.go:65] Loading cluster: ha-949000
	I0831 15:45:07.513553    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:45:07.513782    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:45:07.513810    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:45:07.522673    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52124
	I0831 15:45:07.523026    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:45:07.523408    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:45:07.523425    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:45:07.523666    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:45:07.523786    4003 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:45:07.523871    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:45:07.523962    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 4017
	I0831 15:45:07.524938    4003 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:45:07.525205    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:45:07.525236    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:45:07.534543    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52126
	I0831 15:45:07.534878    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:45:07.535207    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:45:07.535219    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:45:07.535443    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:45:07.535559    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:45:07.535653    4003 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.8
	I0831 15:45:07.535660    4003 certs.go:194] generating shared ca certs ...
	I0831 15:45:07.535672    4003 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:45:07.535838    4003 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:45:07.535909    4003 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:45:07.535919    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:45:07.535943    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:45:07.535961    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:45:07.535978    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:45:07.536528    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:45:07.536755    4003 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:45:07.536797    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:45:07.536911    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:45:07.536985    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:45:07.537034    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:45:07.537191    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:45:07.537301    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.537538    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.537562    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.537590    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:45:07.557183    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:45:07.576458    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:45:07.595921    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:45:07.615402    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:45:07.634516    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:45:07.653693    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:45:07.673154    4003 ssh_runner.go:195] Run: openssl version
	I0831 15:45:07.677604    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:45:07.686971    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.690415    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.690457    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.694634    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:45:07.703764    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:45:07.713184    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.716497    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.716528    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.720770    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:45:07.729910    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:45:07.739116    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.742456    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.742497    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.746707    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:45:07.755769    4003 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:45:07.758843    4003 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:45:07.758878    4003 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.31.0  false true} ...
	I0831 15:45:07.758938    4003 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:45:07.758975    4003 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:45:07.767346    4003 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:45:07.767390    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0831 15:45:07.775359    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:45:07.788534    4003 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:45:07.801886    4003 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:45:07.804685    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:45:07.814373    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:07.913307    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:45:07.928102    4003 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}
	I0831 15:45:07.928288    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:45:07.970019    4003 out.go:177] * Verifying Kubernetes components...
	I0831 15:45:07.990872    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:08.095722    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:45:08.845027    4003 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:45:08.845280    4003 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52edc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:45:08.845323    4003 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:45:08.845512    4003 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:45:08.845557    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:08.845562    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:08.845568    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:08.845571    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:08.847724    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:09.347758    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:09.347784    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:09.347795    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:09.347801    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:09.351055    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:09.846963    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:09.846989    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:09.847000    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:09.847007    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:09.850830    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:10.346983    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:10.346994    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:10.347000    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:10.347004    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:10.349173    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:10.845886    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:10.845909    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:10.845920    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:10.845929    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:10.848792    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:10.848857    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:11.347504    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:11.347528    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:11.347539    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:11.347545    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:11.350440    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:11.846697    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:11.846722    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:11.846744    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:11.846747    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:11.848994    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:12.346908    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:12.346932    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:12.346943    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:12.346949    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:12.349967    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:12.846545    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:12.846570    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:12.846582    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:12.846586    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:12.850076    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:12.850171    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:13.345681    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:13.345701    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:13.345708    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:13.345713    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:13.347803    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:13.846672    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:13.846700    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:13.846713    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:13.846719    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:13.850213    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:14.346092    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:14.346108    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:14.346114    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:14.346118    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:14.348283    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:14.846918    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:14.846932    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:14.846938    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:14.846941    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:14.849111    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:15.346636    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:15.346651    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:15.346661    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:15.346691    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:15.348385    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:15.348441    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:15.846720    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:15.846746    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:15.846757    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:15.846800    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:15.850040    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:16.346787    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:16.346802    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:16.346807    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:16.346810    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:16.349402    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:16.846242    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:16.846267    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:16.846279    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:16.846285    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:16.849465    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:17.346142    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:17.346155    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:17.346163    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:17.346166    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:17.350190    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:45:17.350267    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:17.846549    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:17.846563    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:17.846569    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:17.846574    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:17.848738    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:18.346533    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:18.346558    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:18.346628    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:18.346635    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:18.349746    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:18.845790    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:18.845803    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:18.845810    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:18.845813    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:18.852753    4003 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0831 15:45:19.345910    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:19.345921    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:19.345927    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:19.345930    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:19.348239    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:19.846161    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:19.846188    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:19.846205    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:19.846222    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:19.849249    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:19.849335    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:20.347424    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:20.347504    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:20.347518    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:20.347524    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:20.350150    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:20.845819    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:20.845835    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:20.845842    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:20.845845    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:20.848156    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:21.347305    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:21.347322    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:21.347329    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:21.347334    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:21.349936    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:21.846477    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:21.846497    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:21.846509    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:21.846518    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:21.849139    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:22.346802    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:22.346822    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:22.346830    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:22.346842    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:22.348962    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:22.349019    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:22.847375    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:22.847401    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:22.847456    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:22.847466    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:22.850916    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:23.347018    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:23.347030    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:23.347037    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:23.347041    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:23.348873    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:23.846396    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:23.846412    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:23.846418    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:23.846421    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:23.848619    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:24.346563    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:24.346587    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:24.346598    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:24.346605    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:24.349517    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:24.349596    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:24.847762    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:24.847788    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:24.847799    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:24.847807    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:24.850902    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:25.346975    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:25.346987    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:25.346993    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:25.346996    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:25.349147    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:25.846141    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:25.846199    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:25.846211    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:25.846217    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:25.849027    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:26.346014    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:26.346036    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:26.346047    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:26.346053    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:26.349317    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:26.846724    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:26.846739    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:26.846745    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:26.846748    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:26.848768    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:26.848825    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:27.347046    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:27.347061    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:27.347084    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:27.347088    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:27.349358    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:27.847241    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:27.847266    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:27.847278    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:27.847284    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:27.850635    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:28.346098    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:28.346111    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:28.346118    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:28.346120    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:28.348238    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:28.846743    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:28.846769    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:28.846780    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:28.846788    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:28.850051    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:28.850126    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:29.347209    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:29.347223    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:29.347230    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:29.347234    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:29.349262    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:29.847853    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:29.847871    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:29.847899    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:29.847903    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:29.850095    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:30.346592    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:30.346613    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:30.346624    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:30.346630    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:30.349712    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:30.846746    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:30.846772    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:30.846782    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:30.846787    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:30.850071    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:30.850159    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:31.347223    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:31.347268    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:31.347276    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:31.347280    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:31.349187    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:31.846144    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:31.846163    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:31.846180    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:31.846184    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:31.848310    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:32.346217    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:32.346239    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:32.346248    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:32.346254    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:32.348537    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:32.846981    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:32.846996    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:32.847003    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:32.847010    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:32.848991    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:33.346415    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:33.346427    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:33.346433    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:33.346436    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:33.348444    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:33.348503    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:33.845996    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:33.846023    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:33.846066    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:33.846076    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:33.849334    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:34.347376    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:34.347391    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:34.347398    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:34.347401    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:34.349645    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:34.848093    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:34.848113    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:34.848126    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:34.848134    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:34.851450    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:35.346386    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:35.346405    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:35.346416    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:35.346421    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:35.349660    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:35.349728    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:35.846776    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:35.846793    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:35.846799    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:35.846803    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:35.848988    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:36.348020    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:36.348045    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:36.348055    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:36.348061    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:36.351289    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:36.846442    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:36.846466    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:36.846478    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:36.846485    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:36.849727    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:37.346585    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:37.346598    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:37.346604    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:37.346608    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:37.348823    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:37.846395    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:37.846414    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:37.846425    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:37.846431    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:37.849318    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:37.849429    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:38.347018    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:38.347043    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:38.347055    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:38.347059    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:38.350460    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:38.847528    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:38.847544    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:38.847550    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:38.847554    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:38.849461    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:39.346721    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:39.346741    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:39.346752    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:39.346758    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:39.349742    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:39.846123    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:39.846146    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:39.846158    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:39.846164    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:39.849435    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:39.849503    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:40.346540    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:40.346552    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:40.346558    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:40.346560    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:40.348654    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:40.846152    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:40.846173    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:40.846184    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:40.846206    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:40.849347    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:41.346538    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:41.346550    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:41.346556    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:41.346560    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:41.348413    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:41.846620    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:41.846633    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:41.846639    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:41.846642    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:41.848943    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:42.347207    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:42.347233    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:42.347277    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:42.347287    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:42.350122    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:42.350199    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:42.846206    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:42.846231    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:42.846243    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:42.846251    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:42.849366    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:43.346675    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:43.346691    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:43.346724    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:43.346728    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:43.348764    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:43.846267    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:43.846289    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:43.846301    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:43.846306    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:43.849927    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:44.346504    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:44.346524    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:44.346532    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:44.346540    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:44.349860    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:44.847166    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:44.847180    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:44.847186    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:44.847193    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:44.849509    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:44.849569    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:45.347208    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:45.347222    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:45.347229    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:45.347232    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:45.349172    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:45.846510    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:45.846534    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:45.846545    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:45.846551    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:45.849782    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:46.346141    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:46.346158    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:46.346164    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:46.346167    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:46.347845    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:46.848226    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:46.848252    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:46.848263    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:46.848271    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:46.851712    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:46.851793    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:47.346279    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:47.346291    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:47.346297    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:47.346300    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:47.349863    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:47.847020    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:47.847037    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:47.847043    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:47.847046    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:47.848989    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:48.346969    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:48.346995    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:48.347053    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:48.347063    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:48.350507    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:48.847023    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:48.847043    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:48.847054    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:48.847060    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:48.850155    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:49.348069    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:49.348085    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:49.348091    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:49.348097    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:49.350031    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:49.350125    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:49.846786    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:49.846812    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:49.846834    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:49.846844    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:49.850238    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:50.347108    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:50.347128    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:50.347139    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:50.347144    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:50.350196    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:50.846164    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:50.846180    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:50.846186    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:50.846190    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:50.848092    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:51.347436    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:51.347460    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:51.347471    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:51.347477    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:51.351123    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:51.351195    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:51.847405    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:51.847419    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:51.847428    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:51.847433    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:51.849913    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:52.347071    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:52.347083    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:52.347093    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:52.347096    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:52.349220    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:52.847909    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:52.847932    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:52.847943    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:52.847951    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:52.851063    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:53.346206    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:53.346218    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:53.346224    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:53.346228    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:53.348204    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:53.847919    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:53.847935    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:53.847941    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:53.847945    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:53.849907    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:53.850011    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:54.347348    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:54.347369    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:54.347380    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:54.347385    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:54.351482    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:45:54.846431    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:54.846455    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:54.846466    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:54.846471    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:54.849557    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:55.348109    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:55.348121    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:55.348128    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:55.348131    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:55.350338    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:55.848148    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:55.848170    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:55.848181    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:55.848200    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:55.851241    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:55.851306    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:56.347660    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:56.347686    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:56.347697    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:56.347703    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:56.351077    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:56.847124    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:56.847140    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:56.847146    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:56.847159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:56.849236    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:57.347401    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:57.347416    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:57.347444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:57.347448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:57.350002    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:57.847762    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:57.847778    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:57.847786    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:57.847792    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:57.849933    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:58.347634    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:58.347646    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:58.347652    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:58.347654    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:58.349839    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:58.349896    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:58.846561    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:58.846632    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:58.846645    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:58.846652    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:58.849247    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:59.347174    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:59.347196    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:59.347208    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:59.347215    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:59.350401    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:59.847088    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:59.847103    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:59.847119    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:59.847134    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:59.849352    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:00.347687    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:00.347714    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:00.347726    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:00.347734    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:00.351744    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:00.351819    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:00.848047    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:00.848068    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:00.848079    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:00.848086    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:00.851749    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:01.347871    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:01.347886    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:01.347895    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:01.347899    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:01.350037    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:01.847381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:01.847403    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:01.847414    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:01.847423    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:01.850418    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:02.347961    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:02.347983    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:02.347992    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:02.347997    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:02.351704    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:02.351882    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:02.846644    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:02.846656    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:02.846663    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:02.846667    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:02.848618    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:03.346482    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:03.346503    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:03.346515    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:03.346522    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:03.349938    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:03.846526    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:03.846556    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:03.846616    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:03.846639    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:03.850171    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:04.346820    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:04.346836    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:04.346843    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:04.346860    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:04.349066    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:04.846842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:04.846858    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:04.846868    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:04.846873    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:04.848643    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:04.848700    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:05.348383    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:05.348410    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:05.348423    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:05.348481    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:05.351822    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:05.846904    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:05.846917    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:05.846924    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:05.846927    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:05.848737    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:06.347363    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:06.347388    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:06.347426    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:06.347435    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:06.349807    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:06.846388    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:06.846402    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:06.846411    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:06.846417    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:06.848695    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:06.848754    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:07.346938    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:07.346964    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:07.346991    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:07.347032    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:07.350784    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:07.848381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:07.848408    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:07.848425    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:07.848433    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:07.851814    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:08.348378    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:08.348403    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:08.348415    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:08.348420    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:08.351770    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:08.846356    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:08.846371    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:08.846377    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:08.846382    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:08.848517    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:09.346659    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:09.346686    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:09.346696    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:09.346705    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:09.349594    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:09.349709    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:09.846024    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:09.846037    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:09.846043    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:09.846047    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:09.847975    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:10.346809    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:10.346834    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:10.346845    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:10.346851    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:10.350256    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:10.844381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:10.844403    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:10.844415    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:10.844422    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:10.847674    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:11.344377    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:11.344394    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:11.344400    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:11.344403    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:11.346485    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:11.843236    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:11.843247    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:11.843253    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:11.843257    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:11.845363    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:11.845422    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:12.343795    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:12.343813    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:12.343825    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:12.343840    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:12.347319    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:12.844111    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:12.844127    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:12.844133    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:12.844135    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:12.845879    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:13.343860    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:13.343887    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:13.343899    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:13.343904    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:13.347005    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:13.842634    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:13.842656    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:13.842668    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:13.842674    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:13.845855    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:13.845928    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:14.341496    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:14.341511    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:14.341518    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:14.341522    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:14.343436    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:14.841234    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:14.841255    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:14.841265    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:14.841270    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:14.844398    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:15.341763    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:15.341785    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:15.341796    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:15.341802    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:15.345605    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:15.840145    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:15.840161    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:15.840167    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:15.840170    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:15.842412    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:16.339596    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:16.339612    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:16.339621    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:16.339625    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:16.341841    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:16.341895    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:16.840537    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:16.840560    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:16.840580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:16.840588    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:16.844162    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:17.339830    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:17.339847    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:17.339853    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:17.339862    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:17.341955    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:17.838709    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:17.838734    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:17.838745    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:17.838752    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:17.841971    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:18.339902    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:18.339925    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:18.339936    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:18.339942    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:18.343048    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:18.343121    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:18.837997    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:18.838010    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:18.838017    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:18.838020    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:18.842582    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:46:19.339010    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:19.339088    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:19.339099    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:19.339106    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:19.342495    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:19.839240    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:19.839263    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:19.839274    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:19.839283    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:19.842630    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:20.337822    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:20.337838    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:20.337846    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:20.337852    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:20.339893    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:20.838112    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:20.838140    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:20.838153    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:20.838160    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:20.841535    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:20.841611    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:21.336887    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:21.336902    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:21.336911    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:21.336915    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:21.339247    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:21.837400    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:21.837412    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:21.837416    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:21.837421    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:21.839410    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:22.337957    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:22.337984    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:22.338002    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:22.338016    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:22.341209    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:22.837636    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:22.837662    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:22.837673    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:22.837679    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:22.841366    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:22.841502    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:23.337276    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:23.337291    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:23.337304    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:23.337307    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:23.339521    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:23.836608    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:23.836631    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:23.836644    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:23.836651    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:23.839652    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:24.336381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:24.336438    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:24.336453    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:24.336461    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:24.339223    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:24.834790    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:24.834810    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:24.834835    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:24.834839    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:24.837005    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:25.335102    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:25.335128    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:25.335139    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:25.335148    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:25.338326    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:25.338462    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:25.835276    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:25.835338    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:25.835361    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:25.835369    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:25.838385    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:26.334552    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:26.334565    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:26.334571    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:26.334574    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:26.336860    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:26.834506    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:26.834518    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:26.834524    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:26.834529    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:26.836177    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:27.334080    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:27.334107    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:27.334118    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:27.334125    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:27.337217    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:27.835003    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:27.835014    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:27.835020    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:27.835023    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:27.837029    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:27.837086    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:28.334519    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:28.334541    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:28.334554    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:28.334561    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:28.338051    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:28.834531    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:28.834552    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:28.834564    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:28.834570    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:28.837555    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:29.333171    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:29.333183    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:29.333190    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:29.333193    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:29.335112    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:29.833314    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:29.833337    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:29.833348    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:29.833354    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:29.836452    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:29.836529    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:30.334371    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:30.334430    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:30.334444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:30.334452    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:30.337476    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:30.833481    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:30.833496    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:30.833502    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:30.833506    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:30.835694    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:31.333667    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:31.333787    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:31.333806    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:31.333812    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:31.337229    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:31.832937    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:31.832963    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:31.832976    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:31.832982    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:31.836197    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:31.836277    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:32.334027    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:32.334042    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:32.334049    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:32.334052    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:32.336000    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:32.832302    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:32.832329    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:32.832340    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:32.832349    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:32.835491    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:33.332732    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:33.332754    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:33.332765    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:33.332774    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:33.336007    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:33.832656    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:33.832672    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:33.832678    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:33.832681    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:33.836925    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:46:33.836986    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:34.332711    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:34.332735    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:34.332744    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:34.332748    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:34.336280    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:34.832778    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:34.832803    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:34.832815    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:34.832821    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:34.836052    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:35.331831    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:35.331847    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:35.331853    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:35.331855    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:35.333909    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:35.833174    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:35.833199    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:35.833210    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:35.833217    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:35.836522    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:35.836602    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:36.331760    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:36.331785    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:36.331797    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:36.331808    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:36.335187    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:36.831430    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:36.831443    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:36.831449    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:36.831452    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:36.833390    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:37.332076    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:37.332102    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:37.332113    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:37.332120    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:37.337064    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:46:37.831843    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:37.831865    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:37.831875    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:37.831882    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:37.834817    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:38.330953    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:38.330969    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:38.330996    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:38.331001    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:38.332836    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:38.332899    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:38.831091    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:38.831111    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:38.831134    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:38.831141    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:38.834085    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:39.330988    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:39.331010    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:39.331021    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:39.331030    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:39.334198    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:39.830708    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:39.830722    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:39.830728    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:39.830731    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:39.833084    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:40.331955    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:40.331978    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:40.331988    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:40.331995    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:40.335663    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:40.335827    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:40.831715    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:40.831736    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:40.831748    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:40.831753    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:40.834480    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:41.331801    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:41.331816    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:41.331824    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:41.331828    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:41.333947    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:41.830652    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:41.830674    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:41.830686    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:41.830692    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:41.833807    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:42.330632    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:42.330682    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:42.330694    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:42.330701    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:42.333713    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:42.830375    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:42.830390    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:42.830397    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:42.830400    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:42.832629    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:42.832686    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:43.330682    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:43.330708    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:43.330719    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:43.330725    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:43.333898    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:43.831092    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:43.831113    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:43.831125    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:43.831132    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:43.834043    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:44.331020    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:44.331035    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:44.331041    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:44.331044    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:44.333218    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:44.830357    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:44.830379    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:44.830390    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:44.830397    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:44.833640    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:44.833710    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:45.331564    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:45.331586    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:45.331598    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:45.331602    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:45.334717    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:45.830842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:45.830857    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:45.830864    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:45.830868    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:45.832745    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:46.330292    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:46.330318    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:46.330330    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:46.330346    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:46.333844    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:46.830138    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:46.830164    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:46.830175    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:46.830183    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:46.833916    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:46.833987    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:47.330364    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:47.330380    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:47.330386    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:47.330389    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:47.332650    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:47.830666    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:47.830689    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:47.830701    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:47.830710    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:47.833714    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:48.330763    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:48.330784    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:48.330796    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:48.330804    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:48.334071    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:48.831187    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:48.831203    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:48.831209    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:48.831212    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:48.833347    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:49.330476    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:49.330500    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:49.330511    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:49.330517    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:49.333785    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:49.333851    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:49.831216    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:49.831242    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:49.831252    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:49.831272    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:49.834540    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:50.329535    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:50.329548    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:50.329554    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:50.329557    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:50.331810    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:50.829989    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:50.830011    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:50.830022    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:50.830030    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:50.833229    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:51.329962    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:51.329982    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:51.329998    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:51.330005    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:51.333236    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:51.830064    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:51.830077    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:51.830084    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:51.830087    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:51.832177    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:51.832239    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:52.330485    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:52.330510    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:52.330522    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:52.330528    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:52.334017    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:52.830400    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:52.830425    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:52.830436    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:52.830442    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:52.833770    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:53.329566    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:53.329579    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:53.329585    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:53.329589    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:53.331657    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:53.831320    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:53.831345    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:53.831357    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:53.831367    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:53.834615    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:53.834695    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:54.330494    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:54.330520    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:54.330537    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:54.330543    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:54.333826    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:54.830758    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:54.830774    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:54.830780    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:54.830783    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:54.832979    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:55.330573    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:55.330607    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:55.330642    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:55.330652    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:55.334018    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:55.830009    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:55.830030    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:55.830039    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:55.830045    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:55.833311    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:56.329121    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:56.329135    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:56.329150    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:56.329154    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:56.331151    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:56.331267    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:56.829636    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:56.829658    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:56.829676    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:56.829683    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:56.832997    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:57.330100    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:57.330164    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:57.330179    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:57.330185    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:57.332967    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:57.830447    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:57.830460    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:57.830466    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:57.830473    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:57.832494    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:58.330373    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:58.330394    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:58.330406    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:58.330411    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:58.333052    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:58.333119    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:58.829621    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:58.829634    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:58.829640    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:58.829644    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:58.832012    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:59.329472    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:59.329486    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:59.329493    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:59.329497    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:59.331476    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:59.828991    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:59.829004    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:59.829010    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:59.829013    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:59.832279    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:00.329603    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:00.329622    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:00.329633    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:00.329639    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:00.332733    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:00.830103    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:00.830116    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:00.830122    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:00.830125    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:00.837585    4003 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
	I0831 15:47:00.837647    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:01.330092    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:01.330112    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:01.330124    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:01.330132    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:01.333438    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:01.830117    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:01.830142    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:01.830152    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:01.830156    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:01.833106    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:02.330382    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:02.330398    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:02.330411    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:02.330415    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:02.332370    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:02.829065    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:02.829088    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:02.829101    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:02.829108    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:02.831924    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:03.330094    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:03.330120    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:03.330131    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:03.330136    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:03.333449    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:03.333526    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:03.830291    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:03.830308    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:03.830314    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:03.830317    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:03.832083    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:04.330231    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:04.330254    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:04.330289    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:04.330297    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:04.332924    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:04.829724    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:04.829747    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:04.829759    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:04.829767    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:04.833424    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:05.329231    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:05.329246    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:05.329253    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:05.329255    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:05.331317    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:05.828856    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:05.828876    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:05.828887    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:05.828893    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:05.831350    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:05.831420    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:06.329491    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:06.329514    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:06.329526    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:06.329535    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:06.332911    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:06.830113    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:06.830137    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:06.830167    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:06.830171    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:06.832311    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:07.328832    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:07.328852    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:07.328865    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:07.328872    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:07.331707    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:07.830142    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:07.830169    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:07.830210    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:07.830218    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:07.833304    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:07.833425    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:08.330192    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:08.330204    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:08.330211    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:08.330215    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:08.332216    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:08.829708    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:08.829721    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:08.829728    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:08.829731    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:08.832016    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:09.329901    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:09.329921    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:09.329934    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:09.329939    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:09.332962    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:09.829856    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:09.829869    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:09.829876    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:09.829879    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:09.831857    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:10.329372    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:10.329432    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:10.329446    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:10.329452    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:10.332160    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:10.332227    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:10.829229    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:10.829253    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:10.829265    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:10.829272    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:10.833374    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:47:11.330031    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:11.330047    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:11.330053    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:11.330057    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:11.332038    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:11.829331    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:11.829357    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:11.829372    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:11.829379    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:11.832648    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:12.329974    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:12.329989    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:12.329996    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:12.329999    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:12.332071    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:12.830134    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:12.830150    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:12.830156    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:12.830161    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:12.832336    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:12.832389    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:13.329321    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:13.329343    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:13.329353    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:13.329359    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:13.332441    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:13.828908    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:13.828931    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:13.828943    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:13.828950    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:13.832731    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:14.329733    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:14.329748    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:14.329755    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:14.329758    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:14.331982    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:14.830417    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:14.830445    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:14.830486    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:14.830493    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:14.833911    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:14.833990    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:15.328769    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:15.328790    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:15.328802    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:15.328812    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:15.331836    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:15.829268    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:15.829280    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:15.829286    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:15.829290    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:15.831233    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:16.329720    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:16.329739    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:16.329750    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:16.329758    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:16.332304    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:16.829209    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:16.829226    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:16.829234    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:16.829237    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:16.831627    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:17.330054    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:17.330070    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:17.330076    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:17.330079    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:17.332072    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:17.332162    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:17.829699    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:17.829721    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:17.829733    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:17.829738    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:17.833375    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:18.329515    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:18.329535    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:18.329546    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:18.329552    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:18.332114    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:18.829215    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:18.829228    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:18.829234    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:18.829237    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:18.831755    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:19.329707    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:19.329721    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:19.329728    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:19.329733    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:19.331565    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:19.830156    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:19.830177    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:19.830189    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:19.830198    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:19.833385    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:19.833450    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:20.328992    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:20.329004    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:20.329010    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:20.329014    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:20.331474    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:20.829297    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:20.829321    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:20.829332    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:20.829342    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:20.832512    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:21.329420    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:21.329442    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:21.329454    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:21.329460    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:21.332977    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:21.830340    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:21.830375    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:21.830384    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:21.830389    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:21.832344    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:22.330124    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:22.330146    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:22.330157    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:22.330164    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:22.332847    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:22.332923    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:22.829382    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:22.829408    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:22.829452    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:22.829461    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:22.832159    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:23.329407    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:23.329422    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:23.329429    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:23.329432    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:23.331410    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:23.829613    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:23.829636    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:23.829648    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:23.829654    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:23.832995    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:24.328868    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:24.328900    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:24.328966    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:24.328977    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:24.331905    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:24.829531    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:24.829552    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:24.829562    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:24.829567    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:24.832215    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:24.832290    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:25.329491    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:25.329512    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:25.329523    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:25.329531    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:25.332387    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:25.829129    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:25.829150    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:25.829161    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:25.829170    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:25.831914    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:26.329975    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:26.329998    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:26.330010    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:26.330016    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:26.332377    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:26.828755    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:26.828780    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:26.828794    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:26.828801    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:26.832060    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:27.328656    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:27.328678    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:27.328686    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:27.328696    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:27.332476    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:27.332537    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:27.829167    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:27.829178    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:27.829184    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:27.829187    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:27.830801    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:28.329337    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:28.329357    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:28.329368    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:28.329374    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:28.331877    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:28.828686    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:28.828709    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:28.828719    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:28.828725    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:28.831646    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:29.328641    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:29.328656    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:29.328666    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:29.328670    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:29.330609    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:29.828963    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:29.828978    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:29.828984    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:29.828988    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:29.831283    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:29.831335    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:30.329840    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:30.329859    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:30.329870    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:30.329876    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:30.332855    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:30.830461    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:30.830506    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:30.830516    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:30.830521    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:30.832580    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:31.330097    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:31.330110    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:31.330117    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:31.330120    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:31.331769    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:31.828676    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:31.828694    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:31.828706    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:31.828712    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:31.831715    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:31.831786    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:32.328645    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:32.328695    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:32.328704    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:32.328711    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:32.330855    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:32.830681    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:32.830701    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:32.830711    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:32.830717    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:32.833687    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:33.330045    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:33.330067    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:33.330080    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:33.330088    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:33.333035    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:33.829438    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:33.829470    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:33.829481    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:33.829486    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:33.832104    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:33.832209    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:34.329654    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:34.329677    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:34.329691    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:34.329700    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:34.332562    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:34.828622    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:34.828641    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:34.828652    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:34.828657    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:34.831130    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:35.328804    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:35.328825    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:35.328836    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:35.328843    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:35.331419    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:35.829296    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:35.829317    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:35.829329    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:35.829336    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:35.832744    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:35.832822    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:36.328855    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:36.328879    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:36.328890    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:36.328896    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:36.331894    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:36.828612    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:36.828632    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:36.828644    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:36.828650    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:36.831201    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:37.329040    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:37.329061    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:37.329076    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:37.329082    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:37.332359    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:37.828655    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:37.828667    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:37.828673    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:37.828676    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:37.830554    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:38.328877    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:38.328890    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:38.328896    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:38.328900    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:38.330918    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:38.330989    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:38.828965    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:38.828995    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:38.829051    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:38.829058    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:38.832125    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:39.329128    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:39.329186    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:39.329201    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:39.329209    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:39.332056    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:39.829820    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:39.829836    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:39.829842    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:39.829846    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:39.832218    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:40.328814    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:40.328863    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:40.328877    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:40.328883    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:40.331799    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:40.331871    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:40.829864    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:40.829888    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:40.829904    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:40.829911    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:40.832995    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:41.329765    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:41.329783    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:41.329792    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:41.329797    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:41.332227    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:41.830043    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:41.830062    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:41.830073    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:41.830079    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:41.833230    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:42.330723    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:42.330747    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:42.330787    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:42.330795    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:42.333941    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:42.334034    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:42.829938    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:42.829951    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:42.829957    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:42.829960    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:42.831505    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:43.329861    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:43.329884    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:43.329897    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:43.329903    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:43.333660    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:43.829116    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:43.829142    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:43.829154    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:43.829159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:43.832310    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:44.330318    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:44.330336    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:44.330344    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:44.330350    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:44.332716    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:44.829324    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:44.829351    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:44.829363    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:44.829368    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:44.832857    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:44.832967    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:45.329358    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:45.329380    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:45.329391    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:45.329399    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:45.332784    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:45.829653    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:45.829668    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:45.829675    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:45.829679    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:45.831809    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:46.328769    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:46.328794    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:46.328807    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:46.328812    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:46.331758    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:46.829308    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:46.829333    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:46.829345    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:46.829350    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:46.832699    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:47.330622    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:47.330649    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:47.330701    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:47.330710    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:47.333673    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:47.333746    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:47.829700    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:47.829724    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:47.829735    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:47.829739    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:47.832430    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:48.329697    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:48.329719    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:48.329730    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:48.329739    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:48.333104    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:48.828962    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:48.828977    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:48.828986    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:48.828990    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:48.831389    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:49.329842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:49.329867    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:49.329876    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:49.329883    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:49.332642    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:49.830281    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:49.830308    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:49.830319    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:49.830327    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:49.833684    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:49.833787    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:50.329816    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:50.329831    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:50.329837    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:50.329842    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:50.331764    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:50.829053    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:50.829076    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:50.829088    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:50.829095    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:50.832256    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:51.330225    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:51.330255    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:51.330270    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:51.330281    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:51.333711    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:51.829842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:51.829861    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:51.829872    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:51.829878    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:51.832473    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:52.329568    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:52.329595    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:52.329606    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:52.329618    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:52.332450    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:52.332569    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:52.829778    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:52.829805    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:52.829816    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:52.829822    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:52.833363    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:53.329291    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:53.329306    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:53.329313    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:53.329317    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:53.331172    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:53.830270    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:53.830295    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:53.830306    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:53.830314    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:53.833837    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:54.330125    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:54.330151    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:54.330162    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:54.330168    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:54.333461    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:54.333541    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:54.829293    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:54.829321    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:54.829334    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:54.829341    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:54.832226    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:55.330712    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:55.330738    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:55.330749    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:55.330757    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:55.334141    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:55.828817    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:55.828872    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:55.828887    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:55.828895    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:55.831682    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:56.329544    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:56.329568    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:56.329580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:56.329588    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:56.332148    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:56.830699    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:56.830725    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:56.830736    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:56.830743    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:56.834490    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:56.834565    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:57.329839    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:57.329861    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:57.329873    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:57.329878    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:57.333090    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:57.828829    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:57.828910    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:57.828916    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:57.828920    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:57.830711    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:58.328896    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:58.328923    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:58.328934    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:58.328940    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:58.332463    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:58.828817    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:58.828842    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:58.828854    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:58.828862    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:58.832243    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:59.330153    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:59.330177    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:59.330188    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:59.330193    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:59.333357    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:59.333454    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:59.830783    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:59.830807    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:59.830818    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:59.830876    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:59.834206    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:00.329131    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:00.329150    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:00.329159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:00.329164    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:00.331350    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:00.830148    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:00.830172    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:00.830238    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:00.830248    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:00.832938    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:01.330744    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:01.330765    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:01.330776    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:01.330781    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:01.334219    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:01.334299    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:01.828849    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:01.828871    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:01.828882    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:01.828890    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:01.832151    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:02.329416    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:02.329435    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:02.329444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:02.329448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:02.332568    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:02.829933    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:02.829960    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:02.830044    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:02.830051    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:02.833123    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:03.328950    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:03.328972    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:03.328981    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:03.328989    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:03.331913    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:03.829379    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:03.829445    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:03.829462    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:03.829469    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:03.832420    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:03.832488    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:04.330783    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:04.330808    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:04.330819    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:04.330825    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:04.333835    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:04.828809    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:04.828835    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:04.828844    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:04.828852    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:04.832228    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:05.330083    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:05.330103    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:05.330115    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:05.330122    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:05.333301    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:05.829216    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:05.829239    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:05.829250    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:05.829257    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:05.832698    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:05.832773    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:06.329078    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:06.329103    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:06.329116    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:06.329123    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:06.332045    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:06.830238    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:06.830261    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:06.830306    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:06.830316    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:06.833538    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:07.330777    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:07.330798    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:07.330808    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:07.330815    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:07.334065    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:07.829264    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:07.829288    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:07.829330    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:07.829338    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:07.832368    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:08.329114    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:08.329178    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:08.329193    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:08.329211    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:08.332086    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:08.332156    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:08.829364    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:08.829385    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:08.829397    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:08.829404    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:08.832446    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:09.328860    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:09.328872    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:09.328878    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:09.328881    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:09.331153    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:09.829450    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:09.829472    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:09.829482    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:09.829490    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:09.839325    4003 round_trippers.go:574] Response Status: 404 Not Found in 9 milliseconds
	I0831 15:48:10.329202    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:10.329227    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:10.329290    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:10.329300    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:10.336072    4003 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0831 15:48:10.336141    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:10.829298    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:10.829320    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:10.829333    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:10.829339    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:10.832656    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:11.328862    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:11.328879    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:11.328886    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:11.328890    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:11.331251    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:11.828789    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:11.828814    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:11.828825    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:11.828830    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:11.831875    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:12.329621    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:12.329641    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:12.329652    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:12.329657    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:12.332812    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:12.829177    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:12.829198    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:12.829209    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:12.829215    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:12.832205    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:12.832271    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:13.329690    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:13.329709    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:13.329721    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:13.329726    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:13.332350    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:13.830163    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:13.830187    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:13.830200    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:13.830207    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:13.833785    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:14.330813    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:14.330871    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:14.330889    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:14.330897    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:14.333729    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:14.829241    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:14.829256    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:14.829265    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:14.829271    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:14.831656    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:15.329102    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:15.329117    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:15.329125    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:15.329128    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:15.331035    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:15.331094    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:15.829453    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:15.829477    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:15.829490    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:15.829498    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:15.832921    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:16.330482    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:16.330501    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:16.330512    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:16.330519    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:16.333392    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:16.829494    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:16.829514    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:16.829526    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:16.829531    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:16.832666    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:17.328819    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:17.328832    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:17.328838    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:17.328842    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:17.332907    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:48:17.333002    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:17.830033    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:17.830052    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:17.830063    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:17.830071    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:17.833459    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:18.330056    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:18.330077    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:18.330089    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:18.330094    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:18.333155    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:18.830388    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:18.830402    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:18.830408    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:18.830411    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:18.832447    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:19.329634    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:19.329659    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:19.329671    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:19.329677    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:19.333012    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:19.333085    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:19.829599    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:19.829619    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:19.829631    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:19.829639    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:19.833057    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:20.330129    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:20.330145    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:20.330151    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:20.330154    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:20.331920    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:20.829042    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:20.829056    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:20.829065    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:20.829069    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:20.831640    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:21.330321    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:21.330345    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:21.330357    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:21.330364    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:21.333593    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:21.333742    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:21.829489    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:21.829509    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:21.829521    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:21.829528    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:21.832949    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:22.329074    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:22.329097    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:22.329109    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:22.329115    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:22.332552    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:22.829496    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:22.829514    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:22.829523    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:22.829528    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:22.831769    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:23.329638    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:23.329654    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:23.329662    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:23.329666    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:23.332063    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:23.830053    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:23.830067    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:23.830105    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:23.830115    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:23.832192    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:23.832251    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:24.329240    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:24.329260    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:24.329272    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:24.329277    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:24.332009    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:24.830470    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:24.830482    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:24.830488    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:24.830491    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:24.835168    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:48:25.330931    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:25.330957    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:25.330968    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:25.330974    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:25.334396    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:25.830021    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:25.830047    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:25.830057    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:25.830063    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:25.833612    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:25.833684    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:26.330695    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:26.330715    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:26.330726    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:26.330733    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:26.333858    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:26.829799    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:26.829824    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:26.829833    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:26.829838    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:26.833084    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:27.329417    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:27.329439    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:27.329450    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:27.329457    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:27.333005    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:27.829654    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:27.829674    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:27.829685    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:27.829693    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:27.832427    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:28.329524    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:28.329539    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:28.329580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:28.329585    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:28.331632    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:28.331748    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:28.829893    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:28.829913    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:28.829925    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:28.829932    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:28.833039    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:29.329166    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:29.329185    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:29.329193    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:29.329197    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:29.331783    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:29.829024    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:29.829051    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:29.829062    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:29.829070    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:29.832264    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:30.328905    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:30.328931    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:30.328942    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:30.328947    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:30.332052    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:30.332123    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:30.830052    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:30.830072    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:30.830082    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:30.830091    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:30.833325    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:31.330324    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:31.330348    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:31.330360    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:31.330365    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:31.333570    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:31.830355    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:31.830379    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:31.830391    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:31.830448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:31.833417    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:32.330044    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:32.330081    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:32.330090    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:32.330097    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:32.332188    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:32.332242    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:32.828972    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:32.828987    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:32.828994    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:32.828997    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:32.830746    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:33.330302    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:33.330324    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:33.330335    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:33.330342    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:33.333187    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:33.828871    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:33.828885    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:33.828891    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:33.828894    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:33.830679    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:34.329246    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:34.329269    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:34.329284    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:34.329293    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:34.332379    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:34.332447    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:34.828888    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:34.828903    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:34.828936    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:34.828941    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:34.836178    4003 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
	I0831 15:48:35.330611    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:35.330647    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:35.330655    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:35.330658    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:35.333046    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:35.829308    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:35.829333    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:35.829344    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:35.829352    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:35.832682    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:36.329920    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:36.329937    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:36.329976    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:36.329982    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:36.332428    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:36.332513    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:36.830494    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:36.830509    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:36.830515    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:36.830550    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:36.832561    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:37.329913    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:37.329933    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:37.329944    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:37.329949    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:37.332838    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:37.829024    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:37.829050    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:37.829062    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:37.829069    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:37.832669    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:38.330684    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:38.330699    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:38.330705    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:38.330708    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:38.332762    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:38.332823    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:38.829400    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:38.829426    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:38.829444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:38.829450    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:38.832697    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:39.330303    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:39.330331    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:39.330342    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:39.330348    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:39.333360    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:39.829748    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:39.829768    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:39.829777    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:39.829781    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:39.832089    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:40.328868    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:40.328892    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:40.328903    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:40.328908    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:40.331956    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:40.829153    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:40.829180    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:40.829192    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:40.829199    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:40.832739    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:40.832818    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:41.330714    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:41.330729    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:41.330735    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:41.330738    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:41.332850    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:41.829181    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:41.829207    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:41.829217    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:41.829225    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:41.832653    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:42.330611    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:42.330634    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:42.330646    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:42.330655    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:42.334145    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:42.830611    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:42.830650    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:42.830658    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:42.830662    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:42.832630    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:43.329836    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:43.329858    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:43.329870    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:43.329877    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:43.333122    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:43.333193    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:43.829159    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:43.829183    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:43.829194    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:43.829200    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:43.832264    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:44.330509    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:44.330524    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:44.330531    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:44.330537    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:44.332882    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:44.829657    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:44.829680    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:44.829695    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:44.829700    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:44.832675    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:45.329175    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:45.329200    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:45.329211    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:45.329217    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:45.332400    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:45.829172    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:45.829184    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:45.829191    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:45.829194    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:45.831511    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:45.831573    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:46.329275    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:46.329302    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:46.329312    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:46.329318    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:46.332403    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:46.829488    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:46.829509    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:46.829521    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:46.829527    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:46.832727    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:47.329181    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:47.329197    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:47.329202    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:47.329205    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:47.331729    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:47.829140    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:47.829163    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:47.829175    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:47.829182    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:47.832590    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:47.832666    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:48.329582    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:48.329624    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:48.329632    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:48.329639    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:48.332262    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:48.829927    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:48.829940    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:48.829948    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:48.829951    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:48.832095    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:49.329030    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:49.329054    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:49.329067    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:49.329073    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:49.331713    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:49.829998    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:49.830024    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:49.830035    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:49.830042    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:49.833387    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:49.833457    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:50.329328    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:50.329345    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:50.329351    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:50.329355    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:50.331789    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:50.829290    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:50.829312    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:50.829323    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:50.829327    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:50.832450    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:51.329373    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:51.329396    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:51.329407    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:51.329413    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:51.332584    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:51.828974    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:51.828993    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:51.828999    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:51.829002    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:51.831143    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:52.329568    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:52.329582    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:52.329588    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:52.329591    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:52.331474    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:52.331532    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:52.828983    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:52.829009    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:52.829020    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:52.829027    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:52.831923    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:53.330254    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:53.330266    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:53.330272    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:53.330275    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:53.332376    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:53.829955    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:53.829977    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:53.829986    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:53.829991    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:53.833487    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:54.330025    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:54.330048    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:54.330058    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:54.330064    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:54.332846    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:54.332916    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:54.829445    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:54.829461    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:54.829469    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:54.829473    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:54.831681    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:55.330304    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:55.330329    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:55.330339    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:55.330343    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:55.333464    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:55.829335    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:55.829357    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:55.829372    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:55.829379    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:55.832747    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:56.329562    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:56.329574    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:56.329580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:56.329583    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:56.331549    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:56.830534    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:56.830555    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:56.830566    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:56.830571    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:56.834033    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:56.834111    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:57.329183    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:57.329210    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:57.329220    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:57.329229    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:57.332084    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:57.829250    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:57.829263    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:57.829269    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:57.829273    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:57.831424    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:58.329091    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:58.329116    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:58.329126    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:58.329131    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:58.331697    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:58.831142    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:58.831167    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:58.831178    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:58.831185    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:58.834492    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:58.834557    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:59.329237    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:59.329252    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:59.329257    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:59.329261    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:59.331512    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:59.829320    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:59.829342    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:59.829353    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:59.829359    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:59.832197    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:00.330432    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:00.330451    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:00.330462    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:00.330471    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:00.333067    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:00.829237    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:00.829253    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:00.829260    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:00.829263    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:00.831418    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:01.329358    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:01.329379    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:01.329388    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:01.329393    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:01.332371    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:01.332438    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:01.830578    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:01.830604    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:01.830617    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:01.830623    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:01.834159    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:02.329157    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:02.329173    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:02.329179    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:02.329182    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:02.331067    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:49:02.831085    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:02.831112    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:02.831123    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:02.831130    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:02.834437    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:03.331085    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:03.331109    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:03.331152    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:03.331159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:03.334347    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:03.334422    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:03.829836    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:03.829853    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:03.829859    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:03.829863    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:03.831902    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:04.331065    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:04.331089    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:04.331100    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:04.331107    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:04.334167    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:04.831234    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:04.831261    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:04.831273    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:04.831279    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:04.834602    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:05.330136    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:05.330151    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:05.330157    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:05.330160    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:05.332374    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:05.830128    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:05.830150    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:05.830165    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:05.830171    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:05.834152    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:05.834213    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:06.329879    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:06.329904    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:06.329915    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:06.329924    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:06.332822    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:06.829369    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:06.829385    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:06.829390    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:06.829393    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:06.831713    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:07.329339    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:07.329361    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:07.329373    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:07.329380    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:07.332647    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:07.830352    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:07.830380    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:07.830437    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:07.830448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:07.833556    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:08.329058    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:08.329073    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:08.329079    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:08.329082    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:08.331089    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:49:08.331148    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:08.830337    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:08.830361    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:08.830372    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:08.830379    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:08.833728    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:08.833817    4003 node_ready.go:38] duration metric: took 4m0.004911985s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:49:08.856471    4003 out.go:201] 
	W0831 15:49:08.878133    4003 out.go:270] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0831 15:49:08.878147    4003 out.go:270] * 
	W0831 15:49:08.878920    4003 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 15:49:08.943376    4003 out.go:201] 

** /stderr **
ha_test.go:562: failed to start cluster. args "out/minikube-darwin-amd64 start -p ha-949000 --wait=true -v=7 --alsologtostderr --driver=hyperkit " : exit status 80
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-949000 -n ha-949000
helpers_test.go:245: <<< TestMultiControlPlane/serial/RestartCluster FAILED: start of post-mortem logs <<<
helpers_test.go:246: ======>  post-mortem[TestMultiControlPlane/serial/RestartCluster]: minikube logs <======
helpers_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 logs -n 25
helpers_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 logs -n 25: (3.549899806s)
helpers_test.go:253: TestMultiControlPlane/serial/RestartCluster logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- get pods -o          | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-949000 -v=7                | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-949000 node stop m02 -v=7         | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:33 PDT | 31 Aug 24 15:33 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-949000 node start m02 -v=7        | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:34 PDT | 31 Aug 24 15:34 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-949000 -v=7               | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:35 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-949000 -v=7                    | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:35 PDT | 31 Aug 24 15:36 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-949000 --wait=true -v=7        | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:36 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-949000                    | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT |                     |
	| node    | ha-949000 node delete m03 -v=7       | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT | 31 Aug 24 15:42 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | ha-949000 stop -v=7                  | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT | 31 Aug 24 15:42 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-949000 --wait=true             | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT |                     |
	|         | -v=7 --alsologtostderr               |           |         |         |                     |                     |
	|         | --driver=hyperkit                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:42:55
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:42:55.897896    4003 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:42:55.898177    4003 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:42:55.898183    4003 out.go:358] Setting ErrFile to fd 2...
	I0831 15:42:55.898187    4003 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:42:55.898378    4003 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:42:55.899837    4003 out.go:352] Setting JSON to false
	I0831 15:42:55.921901    4003 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2546,"bootTime":1725141629,"procs":434,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:42:55.922001    4003 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:42:55.944577    4003 out.go:177] * [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:42:55.987096    4003 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:42:55.987175    4003 notify.go:220] Checking for updates...
	I0831 15:42:56.029932    4003 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:42:56.050856    4003 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:42:56.072033    4003 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:42:56.093103    4003 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:42:56.114053    4003 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:42:56.135758    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:42:56.136428    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.136520    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:56.146197    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52047
	I0831 15:42:56.146589    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:56.146991    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:42:56.147003    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:56.147207    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:56.147336    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.147526    4003 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:42:56.147753    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.147780    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:56.156287    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52049
	I0831 15:42:56.156619    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:56.156971    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:42:56.156990    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:56.157191    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:56.157316    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.186031    4003 out.go:177] * Using the hyperkit driver based on existing profile
	I0831 15:42:56.227918    4003 start.go:297] selected driver: hyperkit
	I0831 15:42:56.227945    4003 start.go:901] validating driver "hyperkit" against &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false
ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirro
r: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:42:56.228199    4003 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:42:56.228401    4003 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:42:56.228599    4003 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:42:56.238336    4003 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:42:56.242056    4003 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.242078    4003 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:42:56.244705    4003 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:42:56.244747    4003 cni.go:84] Creating CNI manager for ""
	I0831 15:42:56.244753    4003 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0831 15:42:56.244827    4003 start.go:340] cluster config:
	{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.16
9.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false
kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: S
ocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:42:56.244921    4003 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:42:56.286816    4003 out.go:177] * Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	I0831 15:42:56.307847    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:42:56.307937    4003 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:42:56.307963    4003 cache.go:56] Caching tarball of preloaded images
	I0831 15:42:56.308209    4003 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:42:56.308229    4003 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:42:56.308418    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:42:56.309323    4003 start.go:360] acquireMachinesLock for ha-949000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:42:56.309437    4003 start.go:364] duration metric: took 90.572µs to acquireMachinesLock for "ha-949000"
	I0831 15:42:56.309468    4003 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:42:56.309488    4003 fix.go:54] fixHost starting: 
	I0831 15:42:56.309922    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.309949    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:56.318888    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52051
	I0831 15:42:56.319241    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:56.319612    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:42:56.319626    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:56.319866    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:56.320016    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.320133    4003 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:42:56.320226    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:56.320300    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:42:56.321264    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 3756 missing from process table
	I0831 15:42:56.321288    4003 fix.go:112] recreateIfNeeded on ha-949000: state=Stopped err=<nil>
	I0831 15:42:56.321305    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	W0831 15:42:56.321391    4003 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:42:56.363717    4003 out.go:177] * Restarting existing hyperkit VM for "ha-949000" ...
	I0831 15:42:56.384899    4003 main.go:141] libmachine: (ha-949000) Calling .Start
	I0831 15:42:56.385294    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:56.385370    4003 main.go:141] libmachine: (ha-949000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid
	I0831 15:42:56.387089    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 3756 missing from process table
	I0831 15:42:56.387099    4003 main.go:141] libmachine: (ha-949000) DBG | pid 3756 is in state "Stopped"
	I0831 15:42:56.387115    4003 main.go:141] libmachine: (ha-949000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid...
	I0831 15:42:56.387550    4003 main.go:141] libmachine: (ha-949000) DBG | Using UUID 98cab9ba-901d-49d1-9e6c-321a4533d56e
	I0831 15:42:56.496381    4003 main.go:141] libmachine: (ha-949000) DBG | Generated MAC ce:8:77:f7:42:5e
	I0831 15:42:56.496404    4003 main.go:141] libmachine: (ha-949000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:42:56.496533    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003834d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:42:56.496559    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003834d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:42:56.496621    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "98cab9ba-901d-49d1-9e6c-321a4533d56e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial l
oglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:42:56.496665    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 98cab9ba-901d-49d1-9e6c-321a4533d56e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:42:56.496684    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:42:56.498385    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Pid is 4017
	I0831 15:42:56.498816    4003 main.go:141] libmachine: (ha-949000) DBG | Attempt 0
	I0831 15:42:56.498834    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:56.498897    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 4017
	I0831 15:42:56.500466    4003 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:42:56.500539    4003 main.go:141] libmachine: (ha-949000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:42:56.500570    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39c5e}
	I0831 15:42:56.500583    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 15:42:56.500598    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:42:56.500613    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:42:56.500643    4003 main.go:141] libmachine: (ha-949000) DBG | Found match: ce:8:77:f7:42:5e
	I0831 15:42:56.500654    4003 main.go:141] libmachine: (ha-949000) DBG | IP: 192.169.0.5
	I0831 15:42:56.500687    4003 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:42:56.501361    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:42:56.501546    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:42:56.501931    4003 machine.go:93] provisionDockerMachine start ...
	I0831 15:42:56.501942    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.502103    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:42:56.502225    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:42:56.502347    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:42:56.502457    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:42:56.502550    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:42:56.502680    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:42:56.502894    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:42:56.502905    4003 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:42:56.506309    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:42:56.558516    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:42:56.559184    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:42:56.559207    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:42:56.559278    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:42:56.559308    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:42:56.940245    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:42:56.940260    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:42:57.055064    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:42:57.055080    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:42:57.055092    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:42:57.055101    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:42:57.056061    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:42:57.056073    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:43:02.655390    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 15:43:02.655429    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 15:43:02.655438    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 15:43:02.679403    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 15:43:07.568442    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:43:07.568456    4003 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:43:07.568651    4003 buildroot.go:166] provisioning hostname "ha-949000"
	I0831 15:43:07.568662    4003 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:43:07.568760    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.568847    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:07.568962    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.569093    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.569187    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:07.569365    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:07.569534    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:07.569549    4003 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000 && echo "ha-949000" | sudo tee /etc/hostname
	I0831 15:43:07.639291    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000
	
	I0831 15:43:07.639309    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.639436    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:07.639557    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.639638    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.639737    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:07.639874    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:07.640074    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:07.640086    4003 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:43:07.704134    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:43:07.704155    4003 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:43:07.704172    4003 buildroot.go:174] setting up certificates
	I0831 15:43:07.704178    4003 provision.go:84] configureAuth start
	I0831 15:43:07.704186    4003 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:43:07.704317    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:43:07.704420    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.704522    4003 provision.go:143] copyHostCerts
	I0831 15:43:07.704550    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:07.704624    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:43:07.704632    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:07.704768    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:43:07.704971    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:07.705012    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:43:07.705016    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:07.705108    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:43:07.705254    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:07.705294    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:43:07.705299    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:07.705382    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:43:07.705569    4003 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000 san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]
	I0831 15:43:07.906186    4003 provision.go:177] copyRemoteCerts
	I0831 15:43:07.906273    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:43:07.906312    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.906550    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:07.906738    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.906937    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:07.907046    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:07.944033    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:43:07.944107    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:43:07.963419    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:43:07.963482    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0831 15:43:07.982821    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:43:07.982884    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:43:08.001703    4003 provision.go:87] duration metric: took 297.505228ms to configureAuth
	I0831 15:43:08.001714    4003 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:43:08.001892    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:08.001909    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:08.002046    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:08.002137    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:08.002225    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.002306    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.002382    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:08.002501    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:08.002650    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:08.002659    4003 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:43:08.059324    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:43:08.059336    4003 buildroot.go:70] root file system type: tmpfs
	I0831 15:43:08.059403    4003 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:43:08.059416    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:08.059551    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:08.059659    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.059758    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.059843    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:08.059967    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:08.060104    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:08.060148    4003 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:43:08.127622    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:43:08.127643    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:08.127795    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:08.127885    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.127986    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.128093    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:08.128219    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:08.128373    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:08.128385    4003 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:43:09.818482    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:43:09.818495    4003 machine.go:96] duration metric: took 13.316412951s to provisionDockerMachine
	I0831 15:43:09.818507    4003 start.go:293] postStartSetup for "ha-949000" (driver="hyperkit")
	I0831 15:43:09.818514    4003 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:43:09.818524    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:09.818708    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:43:09.818733    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:09.818845    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:09.818952    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.819031    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:09.819124    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:09.856201    4003 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:43:09.861552    4003 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:43:09.861568    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:43:09.861690    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:43:09.861873    4003 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:43:09.861880    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:43:09.862086    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:43:09.873444    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:09.903949    4003 start.go:296] duration metric: took 85.422286ms for postStartSetup
	I0831 15:43:09.903973    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:09.904145    4003 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:43:09.904158    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:09.904244    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:09.904332    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.904406    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:09.904491    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:09.939732    4003 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:43:09.939783    4003 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:43:09.973207    4003 fix.go:56] duration metric: took 13.663579156s for fixHost
	I0831 15:43:09.973228    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:09.973356    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:09.973449    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.973546    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.973619    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:09.973749    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:09.973922    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:09.973930    4003 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:43:10.027714    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725144190.095289778
	
	I0831 15:43:10.027726    4003 fix.go:216] guest clock: 1725144190.095289778
	I0831 15:43:10.027732    4003 fix.go:229] Guest: 2024-08-31 15:43:10.095289778 -0700 PDT Remote: 2024-08-31 15:43:09.973219 -0700 PDT m=+14.110517944 (delta=122.070778ms)
	I0831 15:43:10.027767    4003 fix.go:200] guest clock delta is within tolerance: 122.070778ms
	I0831 15:43:10.027774    4003 start.go:83] releasing machines lock for "ha-949000", held for 13.718178323s
	I0831 15:43:10.027798    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.027932    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:43:10.028026    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.028324    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.028419    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.028500    4003 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:43:10.028533    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:10.028579    4003 ssh_runner.go:195] Run: cat /version.json
	I0831 15:43:10.028591    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:10.028629    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:10.028705    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:10.028719    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:10.028882    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:10.028892    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:10.028975    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:10.028990    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:10.029049    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:10.106178    4003 ssh_runner.go:195] Run: systemctl --version
	I0831 15:43:10.111111    4003 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:43:10.115308    4003 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:43:10.115344    4003 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:43:10.127805    4003 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:43:10.127827    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:10.127920    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:10.145626    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:43:10.154624    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:43:10.163250    4003 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:43:10.163290    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:43:10.172090    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:10.180802    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:43:10.189726    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:10.198477    4003 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:43:10.207531    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:43:10.216228    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:43:10.224957    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:43:10.233724    4003 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:43:10.241776    4003 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:43:10.249895    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:10.347162    4003 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:43:10.365744    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:10.365818    4003 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:43:10.378577    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:10.391840    4003 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:43:10.407333    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:10.418578    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:10.428427    4003 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:43:10.447942    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:10.460400    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:10.475459    4003 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:43:10.478281    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:43:10.485396    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:43:10.498761    4003 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:43:10.593460    4003 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:43:10.696411    4003 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:43:10.696483    4003 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:43:10.710317    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:10.803031    4003 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:43:13.157366    4003 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.354290436s)
	I0831 15:43:13.157446    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:43:13.167970    4003 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:43:13.180929    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:13.191096    4003 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:43:13.293424    4003 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:43:13.392743    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:13.483508    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:43:13.497374    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:13.508419    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:13.608347    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:43:13.667376    4003 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:43:13.667470    4003 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:43:13.671956    4003 start.go:563] Will wait 60s for crictl version
	I0831 15:43:13.672004    4003 ssh_runner.go:195] Run: which crictl
	I0831 15:43:13.675617    4003 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:43:13.702050    4003 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:43:13.702122    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:13.720302    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:13.762901    4003 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:43:13.762952    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:43:13.763326    4003 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:43:13.768068    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
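The `/etc/hosts` update above is an idempotent filter-and-append pattern: strip any existing `host.minikube.internal` line, append the fresh mapping, and copy the temp file back over the original. A sketch of the same pattern against a scratch hosts file (path and contents hypothetical):

```shell
HOSTS=/tmp/hosts.demo
printf '127.0.0.1\tlocalhost\n192.169.0.1\thost.minikube.internal\n' > "$HOSTS"

# Same trick as the logged command, minus sudo: drop any stale entry,
# then append the current one, so repeated runs never duplicate it.
{ grep -v $'\thost.minikube.internal$' "$HOSTS"; \
  printf '192.169.0.1\thost.minikube.internal\n'; } > /tmp/h.$$
cp /tmp/h.$$ "$HOSTS"

# Even though the entry already existed, there is still exactly one copy.
grep -c 'host.minikube.internal' "$HOSTS"
```

The `$'\t...'` pattern anchors on the tab separator, so a hostname that merely contains `host.minikube.internal` as a substring of a longer name would not be removed.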
	I0831 15:43:13.778798    4003 kubeadm.go:883] updating cluster {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.
0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:f
alse inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOpt
imizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:43:13.778877    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:43:13.778928    4003 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:43:13.792562    4003 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:43:13.792576    4003 docker.go:615] Images already preloaded, skipping extraction
	I0831 15:43:13.792671    4003 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:43:13.806816    4003 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:43:13.806831    4003 cache_images.go:84] Images are preloaded, skipping loading
	I0831 15:43:13.806842    4003 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.31.0 docker true true} ...
	I0831 15:43:13.806921    4003 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:43:13.806997    4003 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:43:13.845829    4003 cni.go:84] Creating CNI manager for ""
	I0831 15:43:13.845843    4003 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0831 15:43:13.845854    4003 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:43:13.845869    4003 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-949000 NodeName:ha-949000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:43:13.845940    4003 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-949000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 15:43:13.845960    4003 kube-vip.go:115] generating kube-vip config ...
	I0831 15:43:13.846014    4003 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:43:13.859390    4003 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:43:13.859457    4003 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable

	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:43:13.859510    4003 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:43:13.867760    4003 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:43:13.867806    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0831 15:43:13.876386    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0831 15:43:13.889628    4003 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:43:13.903120    4003 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0831 15:43:13.916765    4003 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:43:13.930236    4003 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:43:13.933264    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:13.943217    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:14.038507    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:43:14.052829    4003 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.5
	I0831 15:43:14.052841    4003 certs.go:194] generating shared ca certs ...
	I0831 15:43:14.052850    4003 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.053024    4003 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:43:14.053101    4003 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:43:14.053114    4003 certs.go:256] generating profile certs ...
	I0831 15:43:14.053197    4003 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:43:14.053222    4003 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0
	I0831 15:43:14.053237    4003 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0831 15:43:14.128581    4003 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0 ...
	I0831 15:43:14.128599    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0: {Name:mk00e438b52db2444ba8ce93d114dacf50fb7384 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.129258    4003 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0 ...
	I0831 15:43:14.129272    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0: {Name:mkd10daf9fa17e10453b3bbf65f5132bb9bcd577 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.129503    4003 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:43:14.129738    4003 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:43:14.129977    4003 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:43:14.129987    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:43:14.130020    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:43:14.130040    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:43:14.130058    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:43:14.130075    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:43:14.130093    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:43:14.130110    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:43:14.130128    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:43:14.130233    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:43:14.130284    4003 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:43:14.130292    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:43:14.130322    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:43:14.130355    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:43:14.130384    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:43:14.130447    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:14.130483    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.130504    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.130522    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.131005    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:43:14.153234    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:43:14.186923    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:43:14.229049    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:43:14.284589    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0831 15:43:14.334141    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:43:14.385269    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:43:14.429545    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:43:14.461048    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:43:14.494719    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:43:14.523624    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:43:14.557563    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:43:14.571298    4003 ssh_runner.go:195] Run: openssl version
	I0831 15:43:14.575654    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:43:14.584028    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.587453    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.587495    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.591803    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:43:14.600098    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:43:14.608239    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.611660    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.611694    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.615930    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:43:14.624111    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:43:14.632509    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.636012    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.636052    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.640278    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
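The `test -L ... || ln -fs ...` guards above create OpenSSL subject-hash symlinks (names like `b5213941.0`) only when they are missing; the hash comes from the preceding `openssl x509 -hash -noout` call. A sketch of the guard itself on scratch files (the hash-style name here is a placeholder, not computed from a real certificate):

```shell
mkdir -p /tmp/certs
: > /tmp/certs/minikubeCA.pem   # empty placeholder standing in for a CA cert

# Create the hash-named link only if it does not already exist,
# mirroring the logged guard; -f makes a re-run replace a stale link.
test -L /tmp/certs/b5213941.0 || ln -fs /tmp/certs/minikubeCA.pem /tmp/certs/b5213941.0

ls -l /tmp/certs
```

Hash-named links in `/etc/ssl/certs` are what lets OpenSSL locate a trusted CA by subject hash at verification time, which is why each cert copy in this log is followed by one of these guards.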
	I0831 15:43:14.648758    4003 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:43:14.652057    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:43:14.656418    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:43:14.660743    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:43:14.665063    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:43:14.669321    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:43:14.673568    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 15:43:14.677784    4003 kubeadm.go:392] StartCluster: {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 C
lusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:fals
e inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimi
zations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:43:14.677912    4003 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:43:14.690883    4003 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:43:14.698384    4003 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0831 15:43:14.698396    4003 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0831 15:43:14.698441    4003 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0831 15:43:14.706022    4003 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:43:14.706313    4003 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-949000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:14.706401    4003 kubeconfig.go:62] /Users/jenkins/minikube-integration/18943-957/kubeconfig needs updating (will repair): [kubeconfig missing "ha-949000" cluster setting kubeconfig missing "ha-949000" context setting]
	I0831 15:43:14.706628    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.707280    4003 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:14.707484    4003 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, Use
rAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52edc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 15:43:14.707808    4003 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 15:43:14.707985    4003 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0831 15:43:14.715222    4003 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0831 15:43:14.715234    4003 kubeadm.go:597] duration metric: took 16.834195ms to restartPrimaryControlPlane
	I0831 15:43:14.715240    4003 kubeadm.go:394] duration metric: took 37.459181ms to StartCluster
	I0831 15:43:14.715249    4003 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.715327    4003 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:14.715694    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.715917    4003 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:43:14.715930    4003 start.go:241] waiting for startup goroutines ...
	I0831 15:43:14.715938    4003 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 15:43:14.716058    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:14.761177    4003 out.go:177] * Enabled addons: 
	I0831 15:43:14.783218    4003 addons.go:510] duration metric: took 67.285233ms for enable addons: enabled=[]
	I0831 15:43:14.783269    4003 start.go:246] waiting for cluster config update ...
	I0831 15:43:14.783281    4003 start.go:255] writing updated cluster config ...
	I0831 15:43:14.806130    4003 out.go:201] 
	I0831 15:43:14.827581    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:14.827719    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:43:14.850202    4003 out.go:177] * Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	I0831 15:43:14.892085    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:43:14.892153    4003 cache.go:56] Caching tarball of preloaded images
	I0831 15:43:14.892329    4003 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:43:14.892347    4003 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:43:14.892479    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:43:14.893510    4003 start.go:360] acquireMachinesLock for ha-949000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:43:14.893615    4003 start.go:364] duration metric: took 79.031µs to acquireMachinesLock for "ha-949000-m02"
	I0831 15:43:14.893640    4003 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:43:14.893648    4003 fix.go:54] fixHost starting: m02
	I0831 15:43:14.894056    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:43:14.894083    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:43:14.903465    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52073
	I0831 15:43:14.903886    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:43:14.904288    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:43:14.904300    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:43:14.904593    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:43:14.904763    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:14.904931    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:43:14.905038    4003 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:14.905115    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3763
	I0831 15:43:14.906087    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3763 missing from process table
	I0831 15:43:14.906133    4003 fix.go:112] recreateIfNeeded on ha-949000-m02: state=Stopped err=<nil>
	I0831 15:43:14.906161    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	W0831 15:43:14.906324    4003 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:43:14.949174    4003 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m02" ...
	I0831 15:43:14.970157    4003 main.go:141] libmachine: (ha-949000-m02) Calling .Start
	I0831 15:43:14.970435    4003 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:14.970489    4003 main.go:141] libmachine: (ha-949000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid
	I0831 15:43:14.972233    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3763 missing from process table
	I0831 15:43:14.972246    4003 main.go:141] libmachine: (ha-949000-m02) DBG | pid 3763 is in state "Stopped"
	I0831 15:43:14.972295    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid...
	I0831 15:43:14.972683    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Using UUID 23e5d675-5201-4f3d-86b7-b25c818528d1
	I0831 15:43:14.998998    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Generated MAC 92:7:3c:3f:ee:b7
	I0831 15:43:14.999027    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:43:14.999117    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bea80)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:43:14.999143    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bea80)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:43:14.999177    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "23e5d675-5201-4f3d-86b7-b25c818528d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-94
9000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:43:14.999231    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 23e5d675-5201-4f3d-86b7-b25c818528d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 co
nsole=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:43:14.999254    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:43:15.000658    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 DEBUG: hyperkit: Pid is 4035
	I0831 15:43:15.001119    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 0
	I0831 15:43:15.001129    4003 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:15.001211    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 4035
	I0831 15:43:15.003022    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:43:15.003110    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:43:15.003135    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 15:43:15.003157    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39c5e}
	I0831 15:43:15.003193    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 15:43:15.003213    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:43:15.003221    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:43:15.003228    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Found match: 92:7:3c:3f:ee:b7
	I0831 15:43:15.003263    4003 main.go:141] libmachine: (ha-949000-m02) DBG | IP: 192.169.0.6
	I0831 15:43:15.003898    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:15.004131    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:43:15.004587    4003 machine.go:93] provisionDockerMachine start ...
	I0831 15:43:15.004598    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:15.004713    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:15.004819    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:15.004915    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:15.005012    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:15.005089    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:15.005222    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:15.005366    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:15.005373    4003 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:43:15.008900    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:43:15.017748    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:43:15.018656    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:43:15.018679    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:43:15.018711    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:43:15.018731    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:43:15.399794    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:43:15.399810    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:43:15.514263    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:43:15.514282    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:43:15.514290    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:43:15.514296    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:43:15.515095    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:43:15.515105    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:43:21.084857    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:43:21.085024    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:43:21.085033    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:43:21.108855    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:43:50.068778    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:43:50.068792    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:43:50.068914    4003 buildroot.go:166] provisioning hostname "ha-949000-m02"
	I0831 15:43:50.068926    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:43:50.069013    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.069099    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.069176    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.069263    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.069336    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.069474    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.069630    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.069640    4003 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m02 && echo "ha-949000-m02" | sudo tee /etc/hostname
	I0831 15:43:50.130987    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m02
	
	I0831 15:43:50.131001    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.131142    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.131239    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.131330    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.131429    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.131565    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.131704    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.131716    4003 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:43:50.189171    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:43:50.189186    4003 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:43:50.189202    4003 buildroot.go:174] setting up certificates
	I0831 15:43:50.189208    4003 provision.go:84] configureAuth start
	I0831 15:43:50.189215    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:43:50.189354    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:50.189440    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.189529    4003 provision.go:143] copyHostCerts
	I0831 15:43:50.189563    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:50.189610    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:43:50.189616    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:50.189739    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:43:50.189940    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:50.189969    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:43:50.189973    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:50.190084    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:43:50.190251    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:50.190286    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:43:50.190291    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:50.190364    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:43:50.190554    4003 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m02 san=[127.0.0.1 192.169.0.6 ha-949000-m02 localhost minikube]
	I0831 15:43:50.447994    4003 provision.go:177] copyRemoteCerts
	I0831 15:43:50.448048    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:43:50.448062    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.448197    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.448289    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.448376    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.448469    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:50.481386    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:43:50.481457    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:43:50.500479    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:43:50.500539    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:43:50.519580    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:43:50.519638    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:43:50.538582    4003 provision.go:87] duration metric: took 349.361412ms to configureAuth
	I0831 15:43:50.538594    4003 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:43:50.538767    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:50.538781    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:50.538915    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.539010    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.539090    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.539170    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.539253    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.539350    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.539469    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.539477    4003 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:43:50.589461    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:43:50.589472    4003 buildroot.go:70] root file system type: tmpfs
	I0831 15:43:50.589565    4003 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:43:50.589575    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.589709    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.589808    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.589904    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.589997    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.590114    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.590247    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.590295    4003 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:43:50.650656    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:43:50.650675    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.650817    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.650904    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.650975    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.651066    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.651189    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.651328    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.651340    4003 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:43:52.319769    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:43:52.319783    4003 machine.go:96] duration metric: took 37.314787706s to provisionDockerMachine
	I0831 15:43:52.319791    4003 start.go:293] postStartSetup for "ha-949000-m02" (driver="hyperkit")
	I0831 15:43:52.319799    4003 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:43:52.319809    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.319999    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:43:52.320012    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.320113    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.320206    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.320293    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.320379    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:52.352031    4003 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:43:52.355233    4003 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:43:52.355244    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:43:52.355330    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:43:52.355466    4003 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:43:52.355473    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:43:52.355627    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:43:52.362886    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:52.382898    4003 start.go:296] duration metric: took 63.098255ms for postStartSetup
	I0831 15:43:52.382918    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.383098    4003 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:43:52.383110    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.383181    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.383271    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.383354    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.383436    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:52.415810    4003 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:43:52.415864    4003 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:43:52.449230    4003 fix.go:56] duration metric: took 37.555176154s for fixHost
	I0831 15:43:52.449254    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.449385    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.449479    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.449570    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.449656    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.449784    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:52.449933    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:52.449941    4003 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:43:52.500604    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725144232.566642995
	
	I0831 15:43:52.500618    4003 fix.go:216] guest clock: 1725144232.566642995
	I0831 15:43:52.500629    4003 fix.go:229] Guest: 2024-08-31 15:43:52.566642995 -0700 PDT Remote: 2024-08-31 15:43:52.449243 -0700 PDT m=+56.586086649 (delta=117.399995ms)
	I0831 15:43:52.500641    4003 fix.go:200] guest clock delta is within tolerance: 117.399995ms
	I0831 15:43:52.500644    4003 start.go:83] releasing machines lock for "ha-949000-m02", held for 37.60661602s
	I0831 15:43:52.500661    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.500790    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:52.524083    4003 out.go:177] * Found network options:
	I0831 15:43:52.545377    4003 out.go:177]   - NO_PROXY=192.169.0.5
	W0831 15:43:52.567312    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:43:52.567341    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.567964    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.568161    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.568240    4003 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:43:52.568275    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	W0831 15:43:52.568384    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:43:52.568419    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.568477    4003 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:43:52.568494    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.568580    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.568637    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.568715    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.568763    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.568895    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.568930    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:52.569064    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	W0831 15:43:52.598238    4003 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:43:52.598301    4003 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:43:52.641479    4003 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:43:52.641502    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:52.641620    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:52.657762    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:43:52.666682    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:43:52.675584    4003 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:43:52.675632    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:43:52.684590    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:52.693450    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:43:52.702203    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:52.711110    4003 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:43:52.720178    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:43:52.729030    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:43:52.738456    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:43:52.748149    4003 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:43:52.756790    4003 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:43:52.765391    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:52.862859    4003 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:43:52.883299    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:52.883366    4003 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:43:52.900841    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:52.911689    4003 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:43:52.925373    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:52.936790    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:52.947768    4003 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:43:52.969807    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:52.980241    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:52.995125    4003 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:43:52.998026    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:43:53.005290    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:43:53.018832    4003 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:43:53.124064    4003 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:43:53.226798    4003 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:43:53.226820    4003 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:43:53.241337    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:53.342509    4003 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:43:55.695532    4003 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.352978813s)
	I0831 15:43:55.695593    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:43:55.706164    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:55.716443    4003 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:43:55.813069    4003 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:43:55.914225    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:56.017829    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:43:56.031977    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:56.043082    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:56.147482    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:43:56.211631    4003 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:43:56.211708    4003 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:43:56.216202    4003 start.go:563] Will wait 60s for crictl version
	I0831 15:43:56.216251    4003 ssh_runner.go:195] Run: which crictl
	I0831 15:43:56.223176    4003 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:43:56.247497    4003 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:43:56.247568    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:56.264978    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:56.322638    4003 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:43:56.344590    4003 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:43:56.365748    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:56.366152    4003 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:43:56.370681    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:56.380351    4003 mustload.go:65] Loading cluster: ha-949000
	I0831 15:43:56.380517    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:56.380743    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:43:56.380758    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:43:56.389551    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52095
	I0831 15:43:56.390006    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:43:56.390330    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:43:56.390341    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:43:56.390567    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:43:56.390683    4003 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:43:56.390760    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:56.390827    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 4017
	I0831 15:43:56.391784    4003 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:43:56.392030    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:43:56.392047    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:43:56.400646    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52097
	I0831 15:43:56.401071    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:43:56.401432    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:43:56.401449    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:43:56.401654    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:43:56.401763    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:56.401863    4003 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.6
	I0831 15:43:56.401868    4003 certs.go:194] generating shared ca certs ...
	I0831 15:43:56.401876    4003 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:56.402014    4003 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:43:56.402069    4003 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:43:56.402077    4003 certs.go:256] generating profile certs ...
	I0831 15:43:56.402165    4003 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:43:56.402256    4003 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952
	I0831 15:43:56.402304    4003 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:43:56.402311    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:43:56.402331    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:43:56.402351    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:43:56.402368    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:43:56.402387    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:43:56.402405    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:43:56.402427    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:43:56.402445    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:43:56.402522    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:43:56.402560    4003 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:43:56.402572    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:43:56.402605    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:43:56.402639    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:43:56.402671    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:43:56.402737    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:56.402769    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.402811    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.402831    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.402857    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:56.402948    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:56.403031    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:56.403124    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:56.403213    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:56.428694    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:43:56.431875    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:43:56.440490    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:43:56.443670    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:43:56.452165    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:43:56.455207    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:43:56.463624    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:43:56.466671    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:43:56.475535    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:43:56.478615    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:43:56.487110    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:43:56.490238    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:43:56.498895    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:43:56.519238    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:43:56.539011    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:43:56.558598    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:43:56.578234    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0831 15:43:56.597888    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:43:56.617519    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:43:56.637284    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:43:56.657084    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:43:56.676448    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:43:56.696310    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:43:56.715741    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:43:56.729513    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:43:56.743001    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:43:56.756453    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:43:56.770115    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:43:56.784073    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:43:56.797658    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:43:56.810908    4003 ssh_runner.go:195] Run: openssl version
	I0831 15:43:56.815001    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:43:56.823241    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.826641    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.826682    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.830949    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:43:56.839331    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:43:56.847777    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.851154    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.851190    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.855448    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:43:56.863829    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:43:56.872178    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.875731    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.875765    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.879995    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:43:56.888471    4003 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:43:56.892039    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:43:56.896510    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:43:56.900794    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:43:56.904975    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:43:56.909175    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:43:56.913367    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 15:43:56.917519    4003 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.31.0 docker true true} ...
	I0831 15:43:56.917575    4003 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:43:56.917596    4003 kube-vip.go:115] generating kube-vip config ...
	I0831 15:43:56.917626    4003 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:43:56.929983    4003 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:43:56.930030    4003 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:43:56.930087    4003 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:43:56.938650    4003 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:43:56.938693    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:43:56.948188    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:43:56.962082    4003 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:43:56.975374    4003 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:43:56.989089    4003 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:43:56.991924    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:57.001250    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:57.094190    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:43:57.108747    4003 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:43:57.108933    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:57.130218    4003 out.go:177] * Verifying Kubernetes components...
	I0831 15:43:57.171663    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:57.293447    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:43:57.304999    4003 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:57.305203    4003 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52edc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:43:57.305240    4003 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:43:57.305411    4003 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:43:57.305492    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:43:57.305497    4003 round_trippers.go:469] Request Headers:
	I0831 15:43:57.305505    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:43:57.305514    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.959710    4003 round_trippers.go:574] Response Status: 200 OK in 8654 milliseconds
	I0831 15:44:05.960438    4003 node_ready.go:49] node "ha-949000-m02" has status "Ready":"True"
	I0831 15:44:05.960449    4003 node_ready.go:38] duration metric: took 8.6549293s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:44:05.960456    4003 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:44:05.960491    4003 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 15:44:05.960499    4003 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 15:44:05.960533    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:05.960537    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.960545    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.960552    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.970871    4003 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0831 15:44:05.978345    4003 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:05.978408    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:44:05.978422    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.978429    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.978433    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.985369    4003 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:44:05.985815    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:05.985824    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.985830    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.985833    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.991184    4003 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:44:05.991513    4003 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:05.991523    4003 pod_ready.go:82] duration metric: took 13.160164ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:05.991530    4003 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:05.991572    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:44:05.991577    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.991582    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.991587    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.000332    4003 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0831 15:44:06.000855    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.000863    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.000872    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.000878    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.013265    4003 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0831 15:44:06.013530    4003 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.013539    4003 pod_ready.go:82] duration metric: took 22.004461ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.013546    4003 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.013590    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:44:06.013595    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.013601    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.013603    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.020268    4003 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:44:06.020643    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.020651    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.020657    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.020661    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.027711    4003 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:44:06.028254    4003 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.028264    4003 pod_ready.go:82] duration metric: took 14.711969ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.028272    4003 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.028311    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:44:06.028316    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.028322    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.028326    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.039178    4003 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0831 15:44:06.039603    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:06.039612    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.039618    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.039621    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.041381    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:06.041651    4003 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.041661    4003 pod_ready.go:82] duration metric: took 13.384756ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.041667    4003 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.041704    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:44:06.041709    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.041715    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.041718    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.043280    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:06.161143    4003 request.go:632] Waited for 117.478694ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:06.161211    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:06.161216    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.161222    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.161225    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.165879    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:44:06.166023    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "etcd-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:06.166034    4003 pod_ready.go:82] duration metric: took 124.360492ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:06.166042    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "etcd-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:06.166052    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.361793    4003 request.go:632] Waited for 195.664438ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:44:06.361828    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:44:06.361833    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.361839    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.361847    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.363761    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:06.561193    4003 request.go:632] Waited for 196.830957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.561252    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.561266    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.561279    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.561292    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.564567    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:06.565116    4003 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.565128    4003 pod_ready.go:82] duration metric: took 399.063144ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.565137    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.761258    4003 request.go:632] Waited for 195.975667ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:44:06.761325    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:44:06.761334    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.761351    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.761363    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.764874    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:06.960633    4003 request.go:632] Waited for 195.219559ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:06.960666    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:06.960695    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.960702    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.960706    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.966407    4003 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:44:06.966698    4003 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.966707    4003 pod_ready.go:82] duration metric: took 401.560896ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.966714    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.161478    4003 request.go:632] Waited for 194.704872ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:44:07.161625    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:44:07.161636    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.161647    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.161656    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.165538    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:07.361967    4003 request.go:632] Waited for 195.95763ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:07.362001    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:07.362006    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.362012    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.362016    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.363942    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:44:07.364015    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:07.364027    4003 pod_ready.go:82] duration metric: took 397.303245ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:07.364034    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:07.364047    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.561375    4003 request.go:632] Waited for 197.282382ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:44:07.561418    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:44:07.561424    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.561430    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.561434    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.563374    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:07.761585    4003 request.go:632] Waited for 197.505917ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:07.761680    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:07.761692    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.761703    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.761710    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.765076    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:07.765411    4003 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:07.765423    4003 pod_ready.go:82] duration metric: took 401.363562ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.765432    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.961150    4003 request.go:632] Waited for 195.676394ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:44:07.961210    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:44:07.961216    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.961223    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.961232    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.963936    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.160774    4003 request.go:632] Waited for 196.46087ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.160847    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.160855    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.160863    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.160885    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.163147    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.163737    4003 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:08.163748    4003 pod_ready.go:82] duration metric: took 398.305248ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:08.163756    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:08.360946    4003 request.go:632] Waited for 197.148459ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:44:08.361013    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:44:08.361018    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.361025    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.361030    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.363306    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.561349    4003 request.go:632] Waited for 197.594231ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:08.561505    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:08.561518    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.561529    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.561536    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.564572    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:44:08.564661    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:08.564674    4003 pod_ready.go:82] duration metric: took 400.906717ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:08.564683    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:08.564694    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:08.760636    4003 request.go:632] Waited for 195.893531ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:08.760715    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:08.760720    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.760726    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.760729    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.763646    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.961865    4003 request.go:632] Waited for 197.701917ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.961922    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.961933    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.961945    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.961952    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.964688    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:09.160991    4003 request.go:632] Waited for 95.682906ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:09.161056    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:09.161066    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.161078    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.161088    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.164212    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:09.360988    4003 request.go:632] Waited for 196.217621ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.361022    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.361027    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.361055    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.361059    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.363713    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:09.564888    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:09.564900    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.564907    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.564913    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.568623    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:09.760895    4003 request.go:632] Waited for 191.666981ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.760944    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.760952    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.760958    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.760962    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.763257    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:10.065958    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:10.065982    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.065993    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.065998    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.069180    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:10.162666    4003 request.go:632] Waited for 93.035977ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:10.162750    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:10.162767    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.162780    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.162786    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.165653    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:10.565356    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:10.565380    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.565391    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.565397    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.568883    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:10.569642    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:10.569650    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.569655    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.569658    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.571069    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:10.571366    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:11.066968    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:11.066994    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.067006    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.067011    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.070763    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:11.071322    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:11.071330    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.071335    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.071339    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.072824    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:11.565282    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:11.565303    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.565314    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.565320    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.568672    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:11.569364    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:11.569371    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.569378    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.569381    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.571110    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:12.065991    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:12.066013    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.066025    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.066038    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.070105    4003 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:44:12.070531    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:12.070540    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.070548    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.070553    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.072400    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:12.566716    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:12.566745    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.566756    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.566762    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.570548    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:12.570980    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:12.570991    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.571000    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.571005    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.573075    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:12.573392    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:13.065503    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:13.065529    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.065540    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.065545    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.069028    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:13.069606    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:13.069616    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.069624    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.069628    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.071291    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:13.566706    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:13.566719    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.566724    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.566727    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.568695    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:13.569316    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:13.569324    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.569330    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.569340    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.570910    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:14.066070    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:14.066097    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.066110    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.066122    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.069846    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:14.070280    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:14.070288    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.070294    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.070298    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.071983    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:14.565072    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:14.565092    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.565103    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.565121    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.568991    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:14.569470    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:14.569478    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.569486    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.569489    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.571194    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:15.065570    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:15.065590    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.065602    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.065608    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.069259    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:15.069742    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:15.069750    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.069756    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.069761    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.071256    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:15.071608    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:15.565664    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:15.565722    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.565736    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.565743    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.568446    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:15.568953    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:15.568960    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.568966    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.568969    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.570393    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:16.066647    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:16.066673    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.066683    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.066689    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.069968    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:16.070667    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:16.070678    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.070686    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.070700    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.072421    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:16.565080    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:16.565093    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.565100    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.565105    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.567016    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:16.567805    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:16.567814    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.567819    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.567829    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.569508    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:17.065211    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:17.065233    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.065243    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.065249    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.068848    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:17.069431    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:17.069442    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.069451    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.069454    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.071237    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:17.565694    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:17.565715    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.565726    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.565732    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.569041    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:17.569625    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:17.569632    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.569638    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.569648    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.571537    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:17.572005    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:18.065338    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:18.065353    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.065361    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.065365    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.067574    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:18.067956    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:18.067963    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.067969    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.067973    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.069437    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:18.565941    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:18.565963    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.565974    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.565984    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.569115    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:18.569832    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:18.569842    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.569850    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.569854    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.571574    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:19.065517    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:19.065533    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.065540    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.065545    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.068125    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:19.068655    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:19.068662    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.068667    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.068672    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.070197    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:19.566293    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:19.566372    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.566385    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.566395    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.569750    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:19.570211    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:19.570219    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.570224    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.570229    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.571922    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:19.572415    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:20.065051    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:20.065066    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.065073    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.065078    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.070133    4003 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:44:20.070557    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:20.070565    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.070570    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.070573    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.072277    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:20.566009    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:20.566031    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.566042    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.566051    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.570001    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:20.570447    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:20.570453    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.570458    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.570460    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.572199    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:21.065187    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:21.065210    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.065222    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.065227    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.067898    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:21.068345    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:21.068353    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.068358    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.068362    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.069742    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:21.565705    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:21.565724    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.565735    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.565741    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.568938    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:21.569630    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:21.569641    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.569647    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.569655    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.571194    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:22.065027    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:22.065051    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.065062    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.065100    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.069375    4003 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:44:22.069729    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:22.069737    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.069743    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.069747    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.072208    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:22.072476    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:22.565894    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:22.565917    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.565928    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.565937    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.569490    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:22.569886    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:22.569893    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.569899    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.569903    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.571462    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:23.066179    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:23.066201    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.066212    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.066219    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.070218    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:23.070845    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:23.070855    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.070862    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.070867    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.072481    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:23.565085    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:23.565099    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.565105    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.565109    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.567899    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:23.568397    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:23.568405    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.568411    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.568431    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.571121    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:24.065227    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:24.065249    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.065261    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.065270    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.068196    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:24.068774    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:24.068782    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.068787    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.068791    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.070317    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:24.565319    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:24.565332    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.565337    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.565340    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.567104    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:24.567586    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:24.567594    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.567600    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.567603    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.569279    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:24.569664    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:25.066218    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:25.066244    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.066255    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.066260    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.069969    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.070824    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:25.070848    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.070854    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.070863    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.072406    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.072709    4003 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.072718    4003 pod_ready.go:82] duration metric: took 16.507839534s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.072727    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.072755    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:44:25.072760    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.072765    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.072769    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.074170    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.074584    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:25.074591    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.074596    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.074599    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.076066    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:44:25.076162    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-proxy-d45q5" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.076170    4003 pod_ready.go:82] duration metric: took 3.437579ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:25.076175    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-proxy-d45q5" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.076179    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.076206    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:44:25.076211    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.076216    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.076219    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.077746    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.078120    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:25.078127    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.078133    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.078136    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.079498    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.079894    4003 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.079903    4003 pod_ready.go:82] duration metric: took 3.717598ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.079909    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.079936    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:44:25.079941    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.079946    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.079951    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.081600    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.081932    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:25.081940    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.081946    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.081949    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.083262    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.083552    4003 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.083561    4003 pod_ready.go:82] duration metric: took 3.647661ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.083567    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.083594    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:44:25.083603    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.083609    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.083614    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.085111    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.085438    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:25.085446    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.085452    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.085455    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.087068    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.087348    4003 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.087357    4003 pod_ready.go:82] duration metric: took 3.784951ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.087363    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.267294    4003 request.go:632] Waited for 179.857802ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:44:25.267367    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:44:25.267377    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.267389    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.267395    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.271053    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.466554    4003 request.go:632] Waited for 195.015611ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:25.466691    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:25.466701    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.466712    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.466721    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.470050    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:44:25.470127    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.470148    4003 pod_ready.go:82] duration metric: took 382.775358ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:25.470158    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.470165    4003 pod_ready.go:39] duration metric: took 19.509491295s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:44:25.470190    4003 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:44:25.470257    4003 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:44:25.483780    4003 api_server.go:72] duration metric: took 28.374703678s to wait for apiserver process to appear ...
	I0831 15:44:25.483792    4003 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:44:25.483807    4003 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:44:25.486833    4003 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:44:25.486870    4003 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:44:25.486875    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.486882    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.486887    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.487354    4003 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:44:25.487409    4003 api_server.go:141] control plane version: v1.31.0
	I0831 15:44:25.487417    4003 api_server.go:131] duration metric: took 3.620759ms to wait for apiserver health ...
	I0831 15:44:25.487424    4003 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:44:25.666509    4003 request.go:632] Waited for 179.03877ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:25.666550    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:25.666557    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.666565    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.666601    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.670513    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.675202    4003 system_pods.go:59] 24 kube-system pods found
	I0831 15:44:25.675220    4003 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:44:25.675225    4003 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:44:25.675229    4003 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:44:25.675232    4003 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:44:25.675236    4003 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:44:25.675238    4003 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:44:25.675241    4003 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:44:25.675244    4003 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:44:25.675247    4003 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:44:25.675249    4003 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:44:25.675252    4003 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:44:25.675255    4003 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:44:25.675258    4003 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:44:25.675261    4003 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:44:25.675263    4003 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:44:25.675266    4003 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:44:25.675268    4003 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:44:25.675271    4003 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:44:25.675274    4003 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:44:25.675280    4003 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:44:25.675283    4003 system_pods.go:61] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:44:25.675286    4003 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:44:25.675288    4003 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:44:25.675292    4003 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:44:25.675296    4003 system_pods.go:74] duration metric: took 187.866388ms to wait for pod list to return data ...
	I0831 15:44:25.675301    4003 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:44:25.866631    4003 request.go:632] Waited for 191.264353ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:44:25.866761    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:44:25.866771    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.866783    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.866789    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.870307    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.870649    4003 default_sa.go:45] found service account: "default"
	I0831 15:44:25.870663    4003 default_sa.go:55] duration metric: took 195.354455ms for default service account to be created ...
	I0831 15:44:25.870670    4003 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:44:26.067233    4003 request.go:632] Waited for 196.47603ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:26.067280    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:26.067290    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:26.067301    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:26.067307    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:26.072107    4003 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:44:26.077415    4003 system_pods.go:86] 24 kube-system pods found
	I0831 15:44:26.077426    4003 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:44:26.077431    4003 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:44:26.077435    4003 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:44:26.077439    4003 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:44:26.077442    4003 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:44:26.077446    4003 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:44:26.077448    4003 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:44:26.077451    4003 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:44:26.077454    4003 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:44:26.077459    4003 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:44:26.077462    4003 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:44:26.077467    4003 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:44:26.077470    4003 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:44:26.077473    4003 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:44:26.077477    4003 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:44:26.077479    4003 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:44:26.077482    4003 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:44:26.077485    4003 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:44:26.077488    4003 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:44:26.077491    4003 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:44:26.077494    4003 system_pods.go:89] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:44:26.077497    4003 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:44:26.077499    4003 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:44:26.077502    4003 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:44:26.077506    4003 system_pods.go:126] duration metric: took 206.829ms to wait for k8s-apps to be running ...
	I0831 15:44:26.077512    4003 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:44:26.077564    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:44:26.088970    4003 system_svc.go:56] duration metric: took 11.450852ms WaitForService to wait for kubelet
	I0831 15:44:26.088985    4003 kubeadm.go:582] duration metric: took 28.979903586s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:44:26.088998    4003 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:44:26.266791    4003 request.go:632] Waited for 177.710266ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:44:26.266867    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:44:26.266875    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:26.266886    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:26.266896    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:26.270407    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:26.271146    4003 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:44:26.271167    4003 node_conditions.go:123] node cpu capacity is 2
	I0831 15:44:26.271180    4003 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:44:26.271188    4003 node_conditions.go:123] node cpu capacity is 2
	I0831 15:44:26.271193    4003 node_conditions.go:105] duration metric: took 182.189243ms to run NodePressure ...
	I0831 15:44:26.271203    4003 start.go:241] waiting for startup goroutines ...
	I0831 15:44:26.271229    4003 start.go:255] writing updated cluster config ...
	I0831 15:44:26.293325    4003 out.go:201] 
	I0831 15:44:26.315324    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:44:26.315453    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:44:26.337988    4003 out.go:177] * Starting "ha-949000-m04" worker node in "ha-949000" cluster
	I0831 15:44:26.380685    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:44:26.380719    4003 cache.go:56] Caching tarball of preloaded images
	I0831 15:44:26.380921    4003 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:44:26.380941    4003 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:44:26.381080    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:44:26.382207    4003 start.go:360] acquireMachinesLock for ha-949000-m04: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:44:26.382290    4003 start.go:364] duration metric: took 66.399µs to acquireMachinesLock for "ha-949000-m04"
	I0831 15:44:26.382307    4003 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:44:26.382314    4003 fix.go:54] fixHost starting: m04
	I0831 15:44:26.382612    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:44:26.382638    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:44:26.391652    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52102
	I0831 15:44:26.391986    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:44:26.392342    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:44:26.392365    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:44:26.392613    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:44:26.392733    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:44:26.392824    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:44:26.392912    4003 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:44:26.392996    4003 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3806
	I0831 15:44:26.393933    4003 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid 3806 missing from process table
	I0831 15:44:26.393956    4003 fix.go:112] recreateIfNeeded on ha-949000-m04: state=Stopped err=<nil>
	I0831 15:44:26.393965    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	W0831 15:44:26.394099    4003 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:44:26.414853    4003 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m04" ...
	I0831 15:44:26.456728    4003 main.go:141] libmachine: (ha-949000-m04) Calling .Start
	I0831 15:44:26.457073    4003 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:44:26.457142    4003 main.go:141] libmachine: (ha-949000-m04) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid
	I0831 15:44:26.457233    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Using UUID 5ee34770-2239-4427-9789-bd204fe095a6
	I0831 15:44:26.482643    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Generated MAC 8a:3c:61:5f:c5:84
	I0831 15:44:26.482668    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:44:26.482825    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001201e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:44:26.482873    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001201e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:44:26.482921    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "5ee34770-2239-4427-9789-bd204fe095a6", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:44:26.482962    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 5ee34770-2239-4427-9789-bd204fe095a6 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:44:26.482975    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:44:26.484373    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Pid is 4071
	I0831 15:44:26.484859    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 0
	I0831 15:44:26.484876    4003 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:44:26.484959    4003 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 4071
	I0831 15:44:26.487135    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:44:26.487196    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:44:26.487221    4003 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 15:44:26.487236    4003 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 15:44:26.487249    4003 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39c5e}
	I0831 15:44:26.487264    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Found match: 8a:3c:61:5f:c5:84
	I0831 15:44:26.487276    4003 main.go:141] libmachine: (ha-949000-m04) DBG | IP: 192.169.0.8
	I0831 15:44:26.487302    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetConfigRaw
	I0831 15:44:26.488058    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:44:26.488267    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:44:26.488733    4003 machine.go:93] provisionDockerMachine start ...
	I0831 15:44:26.488743    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:44:26.488866    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:44:26.488967    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:44:26.489052    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:44:26.489152    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:44:26.489235    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:44:26.489342    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:44:26.489512    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:44:26.489524    4003 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:44:26.492093    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:44:26.500227    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:44:26.501190    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:44:26.501211    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:44:26.501222    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:44:26.501234    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:44:26.887163    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:44:26.887179    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:44:27.001897    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:44:27.001917    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:44:27.001935    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:44:27.001949    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:44:27.002783    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:44:27.002794    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:44:32.603005    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:44:32.603055    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:44:32.603066    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:44:32.626242    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:45:01.551772    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:45:01.551791    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:45:01.551924    4003 buildroot.go:166] provisioning hostname "ha-949000-m04"
	I0831 15:45:01.551935    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:45:01.552030    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.552119    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.552201    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.552291    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.552372    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.552497    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.552634    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.552642    4003 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m04 && echo "ha-949000-m04" | sudo tee /etc/hostname
	I0831 15:45:01.616885    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m04
	
	I0831 15:45:01.616906    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.617041    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.617145    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.617232    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.617317    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.617452    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.617606    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.617618    4003 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:45:01.675471    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:45:01.675486    4003 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:45:01.675499    4003 buildroot.go:174] setting up certificates
	I0831 15:45:01.675505    4003 provision.go:84] configureAuth start
	I0831 15:45:01.675512    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:45:01.675643    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:45:01.675763    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.675858    4003 provision.go:143] copyHostCerts
	I0831 15:45:01.675886    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:45:01.675959    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:45:01.675965    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:45:01.676118    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:45:01.676365    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:45:01.676407    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:45:01.676412    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:45:01.676500    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:45:01.676663    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:45:01.676709    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:45:01.676714    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:45:01.676793    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:45:01.676940    4003 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m04 san=[127.0.0.1 192.169.0.8 ha-949000-m04 localhost minikube]
	I0831 15:45:01.762314    4003 provision.go:177] copyRemoteCerts
	I0831 15:45:01.762367    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:45:01.762382    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.762557    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.762656    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.762756    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.762844    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:01.796205    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:45:01.796279    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:45:01.815211    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:45:01.815279    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:45:01.834188    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:45:01.834257    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:45:01.853640    4003 provision.go:87] duration metric: took 178.124085ms to configureAuth
	I0831 15:45:01.853653    4003 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:45:01.853819    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:45:01.853832    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:01.853954    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.854036    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.854122    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.854210    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.854294    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.854407    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.854531    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.854538    4003 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:45:01.906456    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:45:01.906469    4003 buildroot.go:70] root file system type: tmpfs
	I0831 15:45:01.906548    4003 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:45:01.906561    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.906689    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.906786    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.906885    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.906960    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.907078    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.907226    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.907270    4003 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:45:01.970284    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:45:01.970303    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.970453    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.970548    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.970632    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.970725    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.970876    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.971019    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.971040    4003 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:45:03.516394    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:45:03.516410    4003 machine.go:96] duration metric: took 37.027272003s to provisionDockerMachine
	I0831 15:45:03.516419    4003 start.go:293] postStartSetup for "ha-949000-m04" (driver="hyperkit")
	I0831 15:45:03.516426    4003 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:45:03.516446    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.516635    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:45:03.516649    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.516745    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.516831    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.516911    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.517003    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:03.549510    4003 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:45:03.552575    4003 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:45:03.552586    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:45:03.552685    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:45:03.552861    4003 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:45:03.552868    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:45:03.553075    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:45:03.560251    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:45:03.579932    4003 start.go:296] duration metric: took 63.505056ms for postStartSetup
	I0831 15:45:03.579953    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.580123    4003 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:45:03.580137    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.580227    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.580304    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.580383    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.580463    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:03.613355    4003 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:45:03.613415    4003 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:45:03.667593    4003 fix.go:56] duration metric: took 37.284874453s for fixHost
	I0831 15:45:03.667632    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.667887    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.668092    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.668253    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.668442    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.668679    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:03.668942    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:03.668957    4003 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:45:03.721925    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725144303.791568584
	
	I0831 15:45:03.721940    4003 fix.go:216] guest clock: 1725144303.791568584
	I0831 15:45:03.721945    4003 fix.go:229] Guest: 2024-08-31 15:45:03.791568584 -0700 PDT Remote: 2024-08-31 15:45:03.667616 -0700 PDT m=+127.803695939 (delta=123.952584ms)
	I0831 15:45:03.721980    4003 fix.go:200] guest clock delta is within tolerance: 123.952584ms
	I0831 15:45:03.721984    4003 start.go:83] releasing machines lock for "ha-949000-m04", held for 37.339285395s
	I0831 15:45:03.722007    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.722145    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:45:03.745774    4003 out.go:177] * Found network options:
	I0831 15:45:03.767373    4003 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0831 15:45:03.788896    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:45:03.788955    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:45:03.788975    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.789814    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.790060    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.790166    4003 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:45:03.790203    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	W0831 15:45:03.790303    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:45:03.790355    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:45:03.790430    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.790462    4003 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:45:03.790479    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.790581    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.790645    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.790761    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.790846    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.790934    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:03.791028    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.791215    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	W0831 15:45:03.820995    4003 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:45:03.821055    4003 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:45:03.865115    4003 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:45:03.865138    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:45:03.865245    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:45:03.881224    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:45:03.890437    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:45:03.899610    4003 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:45:03.899666    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:45:03.908938    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:45:03.918184    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:45:03.927312    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:45:03.936702    4003 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:45:03.946157    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:45:03.955222    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:45:03.964152    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:45:03.973257    4003 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:45:03.981558    4003 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:45:03.989901    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:04.086014    4003 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:45:04.105538    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:45:04.105610    4003 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:45:04.121430    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:45:04.134788    4003 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:45:04.151049    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:45:04.161844    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:45:04.172949    4003 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:45:04.191373    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:45:04.201771    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:45:04.216770    4003 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:45:04.219760    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:45:04.226792    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:45:04.240592    4003 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:45:04.340799    4003 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:45:04.439649    4003 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:45:04.439671    4003 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:45:04.453918    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:04.542337    4003 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:45:06.812888    4003 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.270508765s)
	I0831 15:45:06.812949    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:45:06.823181    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:45:06.833531    4003 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:45:06.936150    4003 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:45:07.044179    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:07.137898    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:45:07.152258    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:45:07.163263    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:07.258016    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:45:07.318759    4003 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:45:07.318841    4003 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:45:07.323364    4003 start.go:563] Will wait 60s for crictl version
	I0831 15:45:07.323422    4003 ssh_runner.go:195] Run: which crictl
	I0831 15:45:07.326572    4003 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:45:07.358444    4003 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:45:07.358520    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:45:07.376088    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:45:07.414824    4003 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:45:07.456544    4003 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:45:07.477408    4003 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:45:07.498401    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:45:07.498760    4003 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:45:07.503179    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:45:07.513368    4003 mustload.go:65] Loading cluster: ha-949000
	I0831 15:45:07.513553    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:45:07.513782    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:45:07.513810    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:45:07.522673    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52124
	I0831 15:45:07.523026    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:45:07.523408    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:45:07.523425    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:45:07.523666    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:45:07.523786    4003 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:45:07.523871    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:45:07.523962    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 4017
	I0831 15:45:07.524938    4003 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:45:07.525205    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:45:07.525236    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:45:07.534543    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52126
	I0831 15:45:07.534878    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:45:07.535207    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:45:07.535219    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:45:07.535443    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:45:07.535559    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:45:07.535653    4003 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.8
	I0831 15:45:07.535660    4003 certs.go:194] generating shared ca certs ...
	I0831 15:45:07.535672    4003 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:45:07.535838    4003 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:45:07.535909    4003 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:45:07.535919    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:45:07.535943    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:45:07.535961    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:45:07.535978    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:45:07.536528    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:45:07.536755    4003 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:45:07.536797    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:45:07.536911    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:45:07.536985    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:45:07.537034    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:45:07.537191    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:45:07.537301    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.537538    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.537562    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.537590    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:45:07.557183    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:45:07.576458    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:45:07.595921    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:45:07.615402    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:45:07.634516    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:45:07.653693    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:45:07.673154    4003 ssh_runner.go:195] Run: openssl version
	I0831 15:45:07.677604    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:45:07.686971    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.690415    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.690457    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.694634    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:45:07.703764    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:45:07.713184    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.716497    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.716528    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.720770    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:45:07.729910    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:45:07.739116    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.742456    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.742497    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.746707    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:45:07.755769    4003 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:45:07.758843    4003 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:45:07.758878    4003 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.31.0  false true} ...
	I0831 15:45:07.758938    4003 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:45:07.758975    4003 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:45:07.767346    4003 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:45:07.767390    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0831 15:45:07.775359    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:45:07.788534    4003 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:45:07.801886    4003 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:45:07.804685    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:45:07.814373    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:07.913307    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:45:07.928102    4003 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}
	I0831 15:45:07.928288    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:45:07.970019    4003 out.go:177] * Verifying Kubernetes components...
	I0831 15:45:07.990872    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:08.095722    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:45:08.845027    4003 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:45:08.845280    4003 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, U
serAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52edc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:45:08.845323    4003 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:45:08.845512    4003 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:45:08.845557    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:08.845562    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:08.845568    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:08.845571    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:08.847724    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:09.347758    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:09.347784    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:09.347795    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:09.347801    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:09.351055    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:09.846963    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:09.846989    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:09.847000    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:09.847007    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:09.850830    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:10.346983    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:10.346994    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:10.347000    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:10.347004    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:10.349173    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:10.845886    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:10.845909    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:10.845920    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:10.845929    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:10.848792    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:10.848857    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:11.347504    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:11.347528    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:11.347539    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:11.347545    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:11.350440    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:11.846697    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:11.846722    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:11.846744    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:11.846747    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:11.848994    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:12.346908    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:12.346932    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:12.346943    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:12.346949    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:12.349967    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:12.846545    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:12.846570    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:12.846582    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:12.846586    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:12.850076    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:12.850171    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:13.345681    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:13.345701    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:13.345708    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:13.345713    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:13.347803    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:13.846672    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:13.846700    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:13.846713    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:13.846719    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:13.850213    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:14.346092    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:14.346108    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:14.346114    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:14.346118    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:14.348283    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:14.846918    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:14.846932    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:14.846938    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:14.846941    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:14.849111    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:15.346636    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:15.346651    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:15.346661    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:15.346691    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:15.348385    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:15.348441    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:15.846720    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:15.846746    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:15.846757    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:15.846800    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:15.850040    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:16.346787    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:16.346802    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:16.346807    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:16.346810    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:16.349402    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:16.846242    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:16.846267    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:16.846279    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:16.846285    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:16.849465    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:17.346142    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:17.346155    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:17.346163    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:17.346166    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:17.350190    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:45:17.350267    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:17.846549    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:17.846563    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:17.846569    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:17.846574    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:17.848738    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:18.346533    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:18.346558    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:18.346628    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:18.346635    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:18.349746    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:18.845790    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:18.845803    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:18.845810    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:18.845813    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:18.852753    4003 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0831 15:45:19.345910    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:19.345921    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:19.345927    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:19.345930    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:19.348239    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:19.846161    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:19.846188    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:19.846205    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:19.846222    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:19.849249    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:19.849335    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:20.347424    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:20.347504    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:20.347518    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:20.347524    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:20.350150    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:20.845819    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:20.845835    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:20.845842    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:20.845845    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:20.848156    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:21.347305    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:21.347322    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:21.347329    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:21.347334    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:21.349936    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:21.846477    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:21.846497    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:21.846509    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:21.846518    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:21.849139    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:22.346802    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:22.346822    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:22.346830    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:22.346842    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:22.348962    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:22.349019    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:22.847375    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:22.847401    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:22.847456    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:22.847466    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:22.850916    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:23.347018    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:23.347030    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:23.347037    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:23.347041    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:23.348873    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:23.846396    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:23.846412    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:23.846418    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:23.846421    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:23.848619    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:24.346563    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:24.346587    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:24.346598    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:24.346605    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:24.349517    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:24.349596    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:24.847762    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:24.847788    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:24.847799    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:24.847807    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:24.850902    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:25.346975    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:25.346987    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:25.346993    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:25.346996    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:25.349147    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:25.846141    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:25.846199    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:25.846211    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:25.846217    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:25.849027    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:26.346014    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:26.346036    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:26.346047    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:26.346053    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:26.349317    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:26.846724    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:26.846739    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:26.846745    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:26.846748    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:26.848768    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:26.848825    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:27.347046    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:27.347061    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:27.347084    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:27.347088    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:27.349358    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:27.847241    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:27.847266    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:27.847278    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:27.847284    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:27.850635    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:28.346098    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:28.346111    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:28.346118    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:28.346120    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:28.348238    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:28.846743    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:28.846769    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:28.846780    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:28.846788    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:28.850051    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:28.850126    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:29.347209    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:29.347223    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:29.347230    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:29.347234    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:29.349262    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:29.847853    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:29.847871    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:29.847899    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:29.847903    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:29.850095    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:30.346592    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:30.346613    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:30.346624    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:30.346630    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:30.349712    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:30.846746    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:30.846772    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:30.846782    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:30.846787    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:30.850071    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:30.850159    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:31.347223    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:31.347268    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:31.347276    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:31.347280    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:31.349187    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:31.846144    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:31.846163    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:31.846180    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:31.846184    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:31.848310    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:32.346217    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:32.346239    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:32.346248    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:32.346254    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:32.348537    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:32.846981    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:32.846996    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:32.847003    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:32.847010    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:32.848991    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:33.346415    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:33.346427    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:33.346433    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:33.346436    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:33.348444    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:33.348503    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:33.845996    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:33.846023    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:33.846066    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:33.846076    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:33.849334    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:34.347376    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:34.347391    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:34.347398    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:34.347401    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:34.349645    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:34.848093    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:34.848113    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:34.848126    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:34.848134    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:34.851450    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:35.346386    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:35.346405    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:35.346416    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:35.346421    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:35.349660    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:35.349728    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:35.846776    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:35.846793    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:35.846799    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:35.846803    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:35.848988    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:36.348020    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:36.348045    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:36.348055    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:36.348061    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:36.351289    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:36.846442    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:36.846466    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:36.846478    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:36.846485    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:36.849727    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:37.346585    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:37.346598    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:37.346604    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:37.346608    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:37.348823    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:37.846395    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:37.846414    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:37.846425    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:37.846431    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:37.849318    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:37.849429    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:38.347018    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:38.347043    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:38.347055    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:38.347059    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:38.350460    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:38.847528    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:38.847544    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:38.847550    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:38.847554    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:38.849461    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:39.346721    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:39.346741    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:39.346752    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:39.346758    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:39.349742    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:39.846123    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:39.846146    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:39.846158    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:39.846164    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:39.849435    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:39.849503    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:40.346540    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:40.346552    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:40.346558    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:40.346560    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:40.348654    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:40.846152    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:40.846173    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:40.846184    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:40.846206    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:40.849347    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:41.346538    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:41.346550    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:41.346556    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:41.346560    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:41.348413    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:41.846620    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:41.846633    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:41.846639    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:41.846642    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:41.848943    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:42.347207    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:42.347233    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:42.347277    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:42.347287    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:42.350122    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:42.350199    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:42.846206    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:42.846231    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:42.846243    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:42.846251    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:42.849366    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:43.346675    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:43.346691    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:43.346724    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:43.346728    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:43.348764    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:43.846267    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:43.846289    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:43.846301    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:43.846306    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:43.849927    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:44.346504    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:44.346524    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:44.346532    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:44.346540    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:44.349860    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:44.847166    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:44.847180    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:44.847186    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:44.847193    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:44.849509    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:44.849569    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:45.347208    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:45.347222    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:45.347229    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:45.347232    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:45.349172    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:45.846510    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:45.846534    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:45.846545    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:45.846551    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:45.849782    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:46.346141    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:46.346158    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:46.346164    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:46.346167    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:46.347845    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:46.848226    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:46.848252    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:46.848263    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:46.848271    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:46.851712    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:46.851793    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:47.346279    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:47.346291    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:47.346297    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:47.346300    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:47.349863    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:47.847020    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:47.847037    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:47.847043    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:47.847046    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:47.848989    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:48.346969    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:48.346995    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:48.347053    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:48.347063    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:48.350507    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:48.847023    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:48.847043    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:48.847054    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:48.847060    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:48.850155    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:49.348069    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:49.348085    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:49.348091    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:49.348097    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:49.350031    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:49.350125    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:49.846786    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:49.846812    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:49.846834    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:49.846844    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:49.850238    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:50.347108    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:50.347128    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:50.347139    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:50.347144    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:50.350196    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:50.846164    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:50.846180    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:50.846186    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:50.846190    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:50.848092    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:51.347436    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:51.347460    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:51.347471    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:51.347477    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:51.351123    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:51.351195    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:51.847405    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:51.847419    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:51.847428    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:51.847433    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:51.849913    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:52.347071    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:52.347083    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:52.347093    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:52.347096    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:52.349220    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:52.847909    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:52.847932    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:52.847943    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:52.847951    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:52.851063    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:53.346206    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:53.346218    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:53.346224    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:53.346228    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:53.348204    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:53.847919    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:53.847935    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:53.847941    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:53.847945    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:53.849907    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:53.850011    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:54.347348    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:54.347369    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:54.347380    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:54.347385    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:54.351482    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:45:54.846431    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:54.846455    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:54.846466    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:54.846471    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:54.849557    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:55.348109    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:55.348121    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:55.348128    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:55.348131    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:55.350338    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:55.848148    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:55.848170    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:55.848181    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:55.848200    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:55.851241    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:55.851306    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:56.347660    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:56.347686    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:56.347697    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:56.347703    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:56.351077    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:56.847124    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:56.847140    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:56.847146    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:56.847159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:56.849236    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:57.347401    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:57.347416    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:57.347444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:57.347448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:57.350002    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:57.847762    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:57.847778    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:57.847786    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:57.847792    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:57.849933    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:58.347634    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:58.347646    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:58.347652    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:58.347654    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:58.349839    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:58.349896    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:58.846561    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:58.846632    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:58.846645    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:58.846652    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:58.849247    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:59.347174    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:59.347196    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:59.347208    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:59.347215    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:59.350401    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:59.847088    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:59.847103    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:59.847119    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:59.847134    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:59.849352    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:00.347687    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:00.347714    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:00.347726    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:00.347734    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:00.351744    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:00.351819    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:00.848047    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:00.848068    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:00.848079    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:00.848086    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:00.851749    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:01.347871    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:01.347886    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:01.347895    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:01.347899    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:01.350037    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:01.847381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:01.847403    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:01.847414    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:01.847423    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:01.850418    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:02.347961    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:02.347983    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:02.347992    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:02.347997    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:02.351704    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:02.351882    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:02.846644    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:02.846656    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:02.846663    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:02.846667    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:02.848618    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:03.346482    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:03.346503    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:03.346515    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:03.346522    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:03.349938    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:03.846526    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:03.846556    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:03.846616    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:03.846639    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:03.850171    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:04.346820    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:04.346836    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:04.346843    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:04.346860    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:04.349066    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:04.846842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:04.846858    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:04.846868    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:04.846873    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:04.848643    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:04.848700    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:05.348383    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:05.348410    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:05.348423    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:05.348481    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:05.351822    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:05.846904    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:05.846917    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:05.846924    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:05.846927    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:05.848737    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:06.347363    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:06.347388    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:06.347426    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:06.347435    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:06.349807    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:06.846388    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:06.846402    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:06.846411    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:06.846417    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:06.848695    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:06.848754    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:07.346938    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:07.346964    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:07.346991    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:07.347032    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:07.350784    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:07.848381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:07.848408    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:07.848425    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:07.848433    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:07.851814    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:08.348378    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:08.348403    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:08.348415    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:08.348420    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:08.351770    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:08.846356    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:08.846371    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:08.846377    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:08.846382    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:08.848517    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:09.346659    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:09.346686    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:09.346696    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:09.346705    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:09.349594    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:09.349709    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:09.846024    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:09.846037    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:09.846043    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:09.846047    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:09.847975    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:10.346809    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:10.346834    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:10.346845    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:10.346851    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:10.350256    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:10.844381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:10.844403    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:10.844415    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:10.844422    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:10.847674    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:11.344377    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:11.344394    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:11.344400    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:11.344403    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:11.346485    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:11.843236    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:11.843247    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:11.843253    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:11.843257    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:11.845363    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:11.845422    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:12.343795    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:12.343813    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:12.343825    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:12.343840    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:12.347319    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:12.844111    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:12.844127    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:12.844133    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:12.844135    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:12.845879    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:13.343860    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:13.343887    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:13.343899    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:13.343904    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:13.347005    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:13.842634    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:13.842656    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:13.842668    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:13.842674    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:13.845855    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:13.845928    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:14.341496    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:14.341511    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:14.341518    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:14.341522    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:14.343436    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:14.841234    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:14.841255    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:14.841265    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:14.841270    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:14.844398    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:15.341763    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:15.341785    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:15.341796    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:15.341802    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:15.345605    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:15.840145    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:15.840161    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:15.840167    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:15.840170    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:15.842412    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:16.339596    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:16.339612    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:16.339621    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:16.339625    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:16.341841    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:16.341895    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:16.840537    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:16.840560    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:16.840580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:16.840588    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:16.844162    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:17.339830    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:17.339847    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:17.339853    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:17.339862    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:17.341955    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:17.838709    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:17.838734    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:17.838745    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:17.838752    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:17.841971    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:18.339902    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:18.339925    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:18.339936    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:18.339942    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:18.343048    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:18.343121    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:18.837997    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:18.838010    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:18.838017    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:18.838020    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:18.842582    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:46:19.339010    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:19.339088    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:19.339099    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:19.339106    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:19.342495    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:19.839240    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:19.839263    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:19.839274    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:19.839283    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:19.842630    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:20.337822    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:20.337838    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:20.337846    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:20.337852    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:20.339893    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:20.838112    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:20.838140    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:20.838153    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:20.838160    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:20.841535    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:20.841611    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:21.336887    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:21.336902    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:21.336911    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:21.336915    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:21.339247    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:21.837400    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:21.837412    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:21.837416    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:21.837421    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:21.839410    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:22.337957    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:22.337984    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:22.338002    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:22.338016    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:22.341209    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:22.837636    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:22.837662    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:22.837673    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:22.837679    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:22.841366    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:22.841502    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:23.337276    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:23.337291    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:23.337304    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:23.337307    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:23.339521    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:23.836608    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:23.836631    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:23.836644    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:23.836651    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:23.839652    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:24.336381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:24.336438    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:24.336453    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:24.336461    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:24.339223    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:24.834790    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:24.834810    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:24.834835    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:24.834839    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:24.837005    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:25.335102    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:25.335128    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:25.335139    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:25.335148    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:25.338326    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:25.338462    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:25.835276    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:25.835338    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:25.835361    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:25.835369    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:25.838385    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:26.334552    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:26.334565    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:26.334571    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:26.334574    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:26.336860    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:26.834506    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:26.834518    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:26.834524    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:26.834529    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:26.836177    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:27.334080    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:27.334107    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:27.334118    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:27.334125    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:27.337217    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:27.835003    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:27.835014    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:27.835020    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:27.835023    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:27.837029    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:27.837086    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:28.334519    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:28.334541    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:28.334554    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:28.334561    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:28.338051    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:28.834531    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:28.834552    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:28.834564    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:28.834570    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:28.837555    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:29.333171    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:29.333183    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:29.333190    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:29.333193    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:29.335112    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:29.833314    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:29.833337    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:29.833348    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:29.833354    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:29.836452    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:29.836529    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:30.334371    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:30.334430    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:30.334444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:30.334452    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:30.337476    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:30.833481    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:30.833496    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:30.833502    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:30.833506    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:30.835694    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:31.333667    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:31.333787    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:31.333806    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:31.333812    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:31.337229    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:31.832937    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:31.832963    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:31.832976    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:31.832982    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:31.836197    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:31.836277    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:32.334027    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:32.334042    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:32.334049    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:32.334052    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:32.336000    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:32.832302    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:32.832329    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:32.832340    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:32.832349    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:32.835491    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:33.332732    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:33.332754    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:33.332765    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:33.332774    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:33.336007    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:33.832656    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:33.832672    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:33.832678    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:33.832681    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:33.836925    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:46:33.836986    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:34.332711    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:34.332735    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:34.332744    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:34.332748    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:34.336280    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:34.832778    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:34.832803    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:34.832815    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:34.832821    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:34.836052    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:35.331831    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:35.331847    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:35.331853    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:35.331855    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:35.333909    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:35.833174    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:35.833199    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:35.833210    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:35.833217    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:35.836522    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:35.836602    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:36.331760    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:36.331785    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:36.331797    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:36.331808    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:36.335187    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:36.831430    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:36.831443    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:36.831449    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:36.831452    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:36.833390    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:37.332076    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:37.332102    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:37.332113    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:37.332120    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:37.337064    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:46:37.831843    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:37.831865    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:37.831875    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:37.831882    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:37.834817    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:38.330953    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:38.330969    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:38.330996    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:38.331001    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:38.332836    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:38.332899    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:38.831091    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:38.831111    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:38.831134    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:38.831141    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:38.834085    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:39.330988    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:39.331010    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:39.331021    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:39.331030    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:39.334198    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:39.830708    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:39.830722    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:39.830728    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:39.830731    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:39.833084    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:40.331955    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:40.331978    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:40.331988    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:40.331995    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:40.335663    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:40.335827    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:40.831715    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:40.831736    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:40.831748    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:40.831753    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:40.834480    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:41.331801    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:41.331816    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:41.331824    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:41.331828    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:41.333947    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:41.830652    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:41.830674    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:41.830686    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:41.830692    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:41.833807    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:42.330632    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:42.330682    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:42.330694    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:42.330701    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:42.333713    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:42.830375    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:42.830390    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:42.830397    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:42.830400    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:42.832629    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:42.832686    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:43.330682    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:43.330708    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:43.330719    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:43.330725    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:43.333898    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:43.831092    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:43.831113    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:43.831125    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:43.831132    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:43.834043    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:44.331020    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:44.331035    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:44.331041    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:44.331044    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:44.333218    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:44.830357    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:44.830379    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:44.830390    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:44.830397    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:44.833640    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:44.833710    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:45.331564    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:45.331586    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:45.331598    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:45.331602    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:45.334717    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:45.830842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:45.830857    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:45.830864    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:45.830868    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:45.832745    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:46.330292    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:46.330318    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:46.330330    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:46.330346    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:46.333844    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:46.830138    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:46.830164    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:46.830175    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:46.830183    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:46.833916    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:46.833987    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:47.330364    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:47.330380    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:47.330386    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:47.330389    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:47.332650    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:47.830666    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:47.830689    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:47.830701    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:47.830710    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:47.833714    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:48.330763    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:48.330784    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:48.330796    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:48.330804    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:48.334071    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:48.831187    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:48.831203    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:48.831209    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:48.831212    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:48.833347    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:49.330476    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:49.330500    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:49.330511    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:49.330517    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:49.333785    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:49.333851    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:49.831216    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:49.831242    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:49.831252    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:49.831272    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:49.834540    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:50.329535    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:50.329548    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:50.329554    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:50.329557    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:50.331810    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:50.829989    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:50.830011    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:50.830022    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:50.830030    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:50.833229    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:51.329962    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:51.329982    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:51.329998    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:51.330005    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:51.333236    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:51.830064    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:51.830077    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:51.830084    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:51.830087    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:51.832177    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:51.832239    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:52.330485    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:52.330510    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:52.330522    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:52.330528    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:52.334017    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:52.830400    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:52.830425    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:52.830436    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:52.830442    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:52.833770    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:53.329566    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:53.329579    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:53.329585    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:53.329589    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:53.331657    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:53.831320    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:53.831345    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:53.831357    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:53.831367    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:53.834615    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:53.834695    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:54.330494    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:54.330520    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:54.330537    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:54.330543    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:54.333826    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:54.830758    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:54.830774    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:54.830780    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:54.830783    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:54.832979    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:55.330573    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:55.330607    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:55.330642    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:55.330652    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:55.334018    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:55.830009    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:55.830030    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:55.830039    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:55.830045    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:55.833311    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:56.329121    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:56.329135    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:56.329150    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:56.329154    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:56.331151    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:56.331267    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:56.829636    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:56.829658    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:56.829676    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:56.829683    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:56.832997    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:57.330100    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:57.330164    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:57.330179    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:57.330185    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:57.332967    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:57.830447    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:57.830460    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:57.830466    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:57.830473    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:57.832494    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:58.330373    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:58.330394    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:58.330406    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:58.330411    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:58.333052    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:58.333119    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:58.829621    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:58.829634    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:58.829640    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:58.829644    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:58.832012    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:59.329472    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:59.329486    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:59.329493    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:59.329497    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:59.331476    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:59.828991    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:59.829004    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:59.829010    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:59.829013    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:59.832279    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:00.329603    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:00.329622    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:00.329633    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:00.329639    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:00.332733    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:00.830103    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:00.830116    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:00.830122    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:00.830125    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:00.837585    4003 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
	I0831 15:47:00.837647    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:01.330092    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:01.330112    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:01.330124    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:01.330132    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:01.333438    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:01.830117    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:01.830142    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:01.830152    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:01.830156    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:01.833106    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:02.330382    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:02.330398    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:02.330411    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:02.330415    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:02.332370    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:02.829065    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:02.829088    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:02.829101    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:02.829108    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:02.831924    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:03.330094    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:03.330120    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:03.330131    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:03.330136    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:03.333449    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:03.333526    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:03.830291    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:03.830308    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:03.830314    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:03.830317    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:03.832083    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:04.330231    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:04.330254    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:04.330289    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:04.330297    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:04.332924    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:04.829724    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:04.829747    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:04.829759    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:04.829767    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:04.833424    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:05.329231    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:05.329246    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:05.329253    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:05.329255    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:05.331317    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:05.828856    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:05.828876    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:05.828887    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:05.828893    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:05.831350    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:05.831420    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:06.329491    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:06.329514    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:06.329526    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:06.329535    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:06.332911    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:06.830113    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:06.830137    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:06.830167    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:06.830171    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:06.832311    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:07.328832    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:07.328852    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:07.328865    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:07.328872    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:07.331707    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:07.830142    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:07.830169    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:07.830210    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:07.830218    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:07.833304    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:07.833425    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:08.330192    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:08.330204    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:08.330211    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:08.330215    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:08.332216    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:08.829708    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:08.829721    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:08.829728    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:08.829731    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:08.832016    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:09.329901    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:09.329921    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:09.329934    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:09.329939    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:09.332962    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:09.829856    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:09.829869    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:09.829876    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:09.829879    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:09.831857    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:10.329372    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:10.329432    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:10.329446    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:10.329452    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:10.332160    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:10.332227    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:10.829229    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:10.829253    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:10.829265    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:10.829272    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:10.833374    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:47:11.330031    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:11.330047    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:11.330053    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:11.330057    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:11.332038    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:11.829331    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:11.829357    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:11.829372    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:11.829379    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:11.832648    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:12.329974    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:12.329989    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:12.329996    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:12.329999    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:12.332071    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:12.830134    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:12.830150    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:12.830156    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:12.830161    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:12.832336    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:12.832389    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:13.329321    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:13.329343    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:13.329353    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:13.329359    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:13.332441    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:13.828908    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:13.828931    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:13.828943    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:13.828950    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:13.832731    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:14.329733    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:14.329748    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:14.329755    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:14.329758    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:14.331982    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:14.830417    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:14.830445    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:14.830486    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:14.830493    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:14.833911    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:14.833990    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:15.328769    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:15.328790    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:15.328802    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:15.328812    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:15.331836    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:15.829268    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:15.829280    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:15.829286    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:15.829290    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:15.831233    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:16.329720    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:16.329739    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:16.329750    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:16.329758    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:16.332304    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:16.829209    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:16.829226    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:16.829234    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:16.829237    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:16.831627    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:17.330054    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:17.330070    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:17.330076    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:17.330079    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:17.332072    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:17.332162    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:17.829699    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:17.829721    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:17.829733    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:17.829738    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:17.833375    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:18.329515    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:18.329535    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:18.329546    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:18.329552    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:18.332114    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:18.829215    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:18.829228    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:18.829234    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:18.829237    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:18.831755    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:19.329707    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:19.329721    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:19.329728    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:19.329733    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:19.331565    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:19.830156    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:19.830177    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:19.830189    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:19.830198    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:19.833385    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:19.833450    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:20.328992    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:20.329004    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:20.329010    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:20.329014    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:20.331474    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:20.829297    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:20.829321    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:20.829332    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:20.829342    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:20.832512    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:21.329420    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:21.329442    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:21.329454    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:21.329460    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:21.332977    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:21.830340    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:21.830375    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:21.830384    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:21.830389    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:21.832344    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:22.330124    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:22.330146    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:22.330157    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:22.330164    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:22.332847    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:22.332923    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:22.829382    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:22.829408    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:22.829452    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:22.829461    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:22.832159    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:23.329407    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:23.329422    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:23.329429    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:23.329432    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:23.331410    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:23.829613    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:23.829636    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:23.829648    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:23.829654    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:23.832995    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:24.328868    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:24.328900    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:24.328966    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:24.328977    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:24.331905    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:24.829531    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:24.829552    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:24.829562    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:24.829567    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:24.832215    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:24.832290    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:25.329491    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:25.329512    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:25.329523    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:25.329531    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:25.332387    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:25.829129    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:25.829150    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:25.829161    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:25.829170    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:25.831914    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:26.329975    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:26.329998    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:26.330010    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:26.330016    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:26.332377    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:26.828755    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:26.828780    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:26.828794    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:26.828801    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:26.832060    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:27.328656    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:27.328678    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:27.328686    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:27.328696    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:27.332476    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:27.332537    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:27.829167    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:27.829178    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:27.829184    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:27.829187    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:27.830801    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:28.329337    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:28.329357    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:28.329368    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:28.329374    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:28.331877    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:28.828686    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:28.828709    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:28.828719    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:28.828725    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:28.831646    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:29.328641    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:29.328656    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:29.328666    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:29.328670    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:29.330609    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:29.828963    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:29.828978    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:29.828984    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:29.828988    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:29.831283    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:29.831335    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:30.329840    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:30.329859    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:30.329870    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:30.329876    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:30.332855    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:30.830461    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:30.830506    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:30.830516    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:30.830521    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:30.832580    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:31.330097    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:31.330110    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:31.330117    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:31.330120    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:31.331769    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:31.828676    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:31.828694    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:31.828706    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:31.828712    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:31.831715    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:31.831786    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:32.328645    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:32.328695    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:32.328704    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:32.328711    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:32.330855    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:32.830681    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:32.830701    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:32.830711    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:32.830717    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:32.833687    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:33.330045    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:33.330067    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:33.330080    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:33.330088    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:33.333035    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:33.829438    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:33.829470    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:33.829481    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:33.829486    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:33.832104    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:33.832209    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:34.329654    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:34.329677    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:34.329691    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:34.329700    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:34.332562    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:34.828622    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:34.828641    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:34.828652    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:34.828657    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:34.831130    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:35.328804    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:35.328825    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:35.328836    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:35.328843    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:35.331419    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:35.829296    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:35.829317    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:35.829329    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:35.829336    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:35.832744    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:35.832822    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:36.328855    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:36.328879    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:36.328890    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:36.328896    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:36.331894    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:36.828612    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:36.828632    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:36.828644    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:36.828650    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:36.831201    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:37.329040    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:37.329061    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:37.329076    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:37.329082    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:37.332359    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:37.828655    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:37.828667    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:37.828673    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:37.828676    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:37.830554    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:38.328877    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:38.328890    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:38.328896    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:38.328900    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:38.330918    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:38.330989    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:38.828965    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:38.828995    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:38.829051    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:38.829058    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:38.832125    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:39.329128    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:39.329186    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:39.329201    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:39.329209    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:39.332056    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:39.829820    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:39.829836    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:39.829842    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:39.829846    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:39.832218    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:40.328814    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:40.328863    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:40.328877    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:40.328883    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:40.331799    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:40.331871    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:40.829864    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:40.829888    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:40.829904    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:40.829911    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:40.832995    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:41.329765    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:41.329783    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:41.329792    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:41.329797    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:41.332227    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:41.830043    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:41.830062    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:41.830073    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:41.830079    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:41.833230    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:42.330723    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:42.330747    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:42.330787    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:42.330795    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:42.333941    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:42.334034    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:42.829938    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:42.829951    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:42.829957    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:42.829960    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:42.831505    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:43.329861    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:43.329884    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:43.329897    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:43.329903    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:43.333660    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:43.829116    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:43.829142    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:43.829154    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:43.829159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:43.832310    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:44.330318    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:44.330336    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:44.330344    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:44.330350    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:44.332716    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:44.829324    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:44.829351    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:44.829363    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:44.829368    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:44.832857    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:44.832967    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:45.329358    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:45.329380    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:45.329391    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:45.329399    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:45.332784    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:45.829653    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:45.829668    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:45.829675    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:45.829679    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:45.831809    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:46.328769    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:46.328794    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:46.328807    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:46.328812    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:46.331758    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:46.829308    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:46.829333    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:46.829345    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:46.829350    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:46.832699    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:47.330622    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:47.330649    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:47.330701    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:47.330710    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:47.333673    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:47.333746    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:47.829700    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:47.829724    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:47.829735    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:47.829739    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:47.832430    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:48.329697    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:48.329719    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:48.329730    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:48.329739    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:48.333104    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:48.828962    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:48.828977    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:48.828986    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:48.828990    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:48.831389    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:49.329842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:49.329867    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:49.329876    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:49.329883    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:49.332642    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:49.830281    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:49.830308    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:49.830319    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:49.830327    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:49.833684    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:49.833787    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:50.329816    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:50.329831    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:50.329837    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:50.329842    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:50.331764    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:50.829053    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:50.829076    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:50.829088    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:50.829095    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:50.832256    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:51.330225    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:51.330255    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:51.330270    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:51.330281    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:51.333711    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:51.829842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:51.829861    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:51.829872    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:51.829878    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:51.832473    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:52.329568    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:52.329595    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:52.329606    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:52.329618    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:52.332450    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:52.332569    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:52.829778    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:52.829805    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:52.829816    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:52.829822    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:52.833363    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:53.329291    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:53.329306    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:53.329313    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:53.329317    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:53.331172    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:53.830270    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:53.830295    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:53.830306    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:53.830314    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:53.833837    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:54.330125    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:54.330151    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:54.330162    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:54.330168    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:54.333461    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:54.333541    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:54.829293    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:54.829321    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:54.829334    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:54.829341    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:54.832226    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:55.330712    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:55.330738    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:55.330749    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:55.330757    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:55.334141    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:55.828817    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:55.828872    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:55.828887    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:55.828895    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:55.831682    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:56.329544    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:56.329568    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:56.329580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:56.329588    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:56.332148    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:56.830699    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:56.830725    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:56.830736    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:56.830743    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:56.834490    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:56.834565    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:57.329839    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:57.329861    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:57.329873    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:57.329878    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:57.333090    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:57.828829    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:57.828910    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:57.828916    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:57.828920    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:57.830711    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:58.328896    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:58.328923    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:58.328934    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:58.328940    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:58.332463    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:58.828817    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:58.828842    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:58.828854    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:58.828862    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:58.832243    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:59.330153    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:59.330177    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:59.330188    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:59.330193    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:59.333357    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:59.333454    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:59.830783    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:59.830807    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:59.830818    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:59.830876    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:59.834206    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:00.329131    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:00.329150    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:00.329159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:00.329164    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:00.331350    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:00.830148    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:00.830172    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:00.830238    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:00.830248    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:00.832938    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:01.330744    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:01.330765    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:01.330776    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:01.330781    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:01.334219    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:01.334299    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:01.828849    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:01.828871    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:01.828882    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:01.828890    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:01.832151    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:02.329416    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:02.329435    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:02.329444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:02.329448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:02.332568    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:02.829933    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:02.829960    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:02.830044    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:02.830051    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:02.833123    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:03.328950    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:03.328972    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:03.328981    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:03.328989    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:03.331913    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:03.829379    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:03.829445    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:03.829462    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:03.829469    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:03.832420    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:03.832488    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:04.330783    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:04.330808    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:04.330819    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:04.330825    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:04.333835    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:04.828809    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:04.828835    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:04.828844    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:04.828852    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:04.832228    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:05.330083    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:05.330103    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:05.330115    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:05.330122    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:05.333301    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:05.829216    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:05.829239    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:05.829250    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:05.829257    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:05.832698    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:05.832773    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:06.329078    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:06.329103    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:06.329116    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:06.329123    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:06.332045    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:06.830238    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:06.830261    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:06.830306    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:06.830316    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:06.833538    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:07.330777    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:07.330798    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:07.330808    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:07.330815    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:07.334065    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:07.829264    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:07.829288    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:07.829330    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:07.829338    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:07.832368    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:08.329114    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:08.329178    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:08.329193    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:08.329211    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:08.332086    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:08.332156    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:08.829364    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:08.829385    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:08.829397    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:08.829404    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:08.832446    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:09.328860    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:09.328872    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:09.328878    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:09.328881    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:09.331153    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:09.829450    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:09.829472    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:09.829482    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:09.829490    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:09.839325    4003 round_trippers.go:574] Response Status: 404 Not Found in 9 milliseconds
	I0831 15:48:10.329202    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:10.329227    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:10.329290    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:10.329300    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:10.336072    4003 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0831 15:48:10.336141    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:10.829298    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:10.829320    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:10.829333    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:10.829339    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:10.832656    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:11.328862    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:11.328879    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:11.328886    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:11.328890    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:11.331251    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:11.828789    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:11.828814    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:11.828825    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:11.828830    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:11.831875    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:12.329621    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:12.329641    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:12.329652    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:12.329657    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:12.332812    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:12.829177    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:12.829198    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:12.829209    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:12.829215    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:12.832205    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:12.832271    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:13.329690    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:13.329709    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:13.329721    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:13.329726    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:13.332350    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:13.830163    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:13.830187    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:13.830200    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:13.830207    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:13.833785    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:14.330813    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:14.330871    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:14.330889    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:14.330897    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:14.333729    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:14.829241    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:14.829256    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:14.829265    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:14.829271    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:14.831656    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:15.329102    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:15.329117    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:15.329125    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:15.329128    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:15.331035    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:15.331094    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:15.829453    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:15.829477    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:15.829490    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:15.829498    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:15.832921    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:16.330482    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:16.330501    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:16.330512    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:16.330519    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:16.333392    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:16.829494    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:16.829514    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:16.829526    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:16.829531    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:16.832666    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:17.328819    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:17.328832    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:17.328838    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:17.328842    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:17.332907    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:48:17.333002    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:17.830033    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:17.830052    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:17.830063    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:17.830071    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:17.833459    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:18.330056    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:18.330077    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:18.330089    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:18.330094    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:18.333155    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:18.830388    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:18.830402    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:18.830408    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:18.830411    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:18.832447    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:19.329634    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:19.329659    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:19.329671    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:19.329677    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:19.333012    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:19.333085    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:19.829599    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:19.829619    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:19.829631    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:19.829639    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:19.833057    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:20.330129    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:20.330145    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:20.330151    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:20.330154    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:20.331920    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:20.829042    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:20.829056    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:20.829065    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:20.829069    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:20.831640    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:21.330321    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:21.330345    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:21.330357    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:21.330364    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:21.333593    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:21.333742    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:21.829489    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:21.829509    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:21.829521    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:21.829528    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:21.832949    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:22.329074    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:22.329097    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:22.329109    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:22.329115    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:22.332552    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:22.829496    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:22.829514    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:22.829523    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:22.829528    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:22.831769    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:23.329638    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:23.329654    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:23.329662    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:23.329666    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:23.332063    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:23.830053    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:23.830067    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:23.830105    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:23.830115    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:23.832192    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:23.832251    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:24.329240    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:24.329260    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:24.329272    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:24.329277    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:24.332009    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:24.830470    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:24.830482    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:24.830488    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:24.830491    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:24.835168    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:48:25.330931    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:25.330957    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:25.330968    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:25.330974    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:25.334396    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:25.830021    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:25.830047    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:25.830057    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:25.830063    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:25.833612    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:25.833684    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:26.330695    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:26.330715    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:26.330726    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:26.330733    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:26.333858    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:26.829799    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:26.829824    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:26.829833    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:26.829838    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:26.833084    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:27.329417    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:27.329439    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:27.329450    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:27.329457    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:27.333005    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:27.829654    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:27.829674    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:27.829685    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:27.829693    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:27.832427    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:28.329524    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:28.329539    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:28.329580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:28.329585    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:28.331632    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:28.331748    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:28.829893    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:28.829913    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:28.829925    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:28.829932    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:28.833039    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:29.329166    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:29.329185    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:29.329193    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:29.329197    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:29.331783    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:29.829024    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:29.829051    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:29.829062    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:29.829070    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:29.832264    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:30.328905    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:30.328931    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:30.328942    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:30.328947    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:30.332052    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:30.332123    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:30.830052    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:30.830072    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:30.830082    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:30.830091    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:30.833325    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:31.330324    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:31.330348    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:31.330360    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:31.330365    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:31.333570    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:31.830355    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:31.830379    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:31.830391    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:31.830448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:31.833417    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:32.330044    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:32.330081    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:32.330090    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:32.330097    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:32.332188    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:32.332242    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:32.828972    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:32.828987    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:32.828994    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:32.828997    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:32.830746    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:33.330302    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:33.330324    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:33.330335    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:33.330342    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:33.333187    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:33.828871    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:33.828885    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:33.828891    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:33.828894    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:33.830679    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:34.329246    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:34.329269    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:34.329284    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:34.329293    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:34.332379    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:34.332447    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:34.828888    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:34.828903    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:34.828936    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:34.828941    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:34.836178    4003 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
	I0831 15:48:35.330611    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:35.330647    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:35.330655    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:35.330658    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:35.333046    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:35.829308    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:35.829333    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:35.829344    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:35.829352    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:35.832682    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:36.329920    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:36.329937    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:36.329976    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:36.329982    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:36.332428    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:36.332513    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:36.830494    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:36.830509    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:36.830515    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:36.830550    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:36.832561    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:37.329913    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:37.329933    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:37.329944    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:37.329949    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:37.332838    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:37.829024    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:37.829050    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:37.829062    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:37.829069    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:37.832669    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:38.330684    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:38.330699    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:38.330705    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:38.330708    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:38.332762    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:38.332823    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:38.829400    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:38.829426    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:38.829444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:38.829450    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:38.832697    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:39.330303    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:39.330331    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:39.330342    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:39.330348    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:39.333360    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:39.829748    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:39.829768    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:39.829777    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:39.829781    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:39.832089    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:40.328868    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:40.328892    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:40.328903    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:40.328908    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:40.331956    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:40.829153    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:40.829180    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:40.829192    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:40.829199    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:40.832739    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:40.832818    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:41.330714    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:41.330729    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:41.330735    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:41.330738    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:41.332850    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:41.829181    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:41.829207    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:41.829217    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:41.829225    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:41.832653    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:42.330611    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:42.330634    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:42.330646    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:42.330655    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:42.334145    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:42.830611    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:42.830650    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:42.830658    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:42.830662    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:42.832630    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:43.329836    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:43.329858    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:43.329870    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:43.329877    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:43.333122    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:43.333193    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:43.829159    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:43.829183    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:43.829194    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:43.829200    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:43.832264    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:44.330509    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:44.330524    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:44.330531    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:44.330537    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:44.332882    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:44.829657    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:44.829680    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:44.829695    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:44.829700    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:44.832675    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:45.329175    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:45.329200    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:45.329211    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:45.329217    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:45.332400    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:45.829172    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:45.829184    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:45.829191    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:45.829194    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:45.831511    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:45.831573    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:46.329275    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:46.329302    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:46.329312    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:46.329318    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:46.332403    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:46.829488    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:46.829509    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:46.829521    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:46.829527    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:46.832727    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:47.329181    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:47.329197    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:47.329202    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:47.329205    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:47.331729    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:47.829140    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:47.829163    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:47.829175    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:47.829182    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:47.832590    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:47.832666    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:48.329582    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:48.329624    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:48.329632    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:48.329639    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:48.332262    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:48.829927    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:48.829940    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:48.829948    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:48.829951    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:48.832095    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:49.329030    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:49.329054    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:49.329067    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:49.329073    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:49.331713    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:49.829998    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:49.830024    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:49.830035    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:49.830042    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:49.833387    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:49.833457    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:50.329328    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:50.329345    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:50.329351    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:50.329355    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:50.331789    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:50.829290    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:50.829312    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:50.829323    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:50.829327    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:50.832450    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:51.329373    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:51.329396    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:51.329407    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:51.329413    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:51.332584    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:51.828974    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:51.828993    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:51.828999    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:51.829002    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:51.831143    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:52.329568    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:52.329582    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:52.329588    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:52.329591    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:52.331474    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:52.331532    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:52.828983    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:52.829009    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:52.829020    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:52.829027    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:52.831923    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:53.330254    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:53.330266    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:53.330272    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:53.330275    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:53.332376    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:53.829955    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:53.829977    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:53.829986    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:53.829991    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:53.833487    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:54.330025    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:54.330048    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:54.330058    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:54.330064    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:54.332846    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:54.332916    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:54.829445    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:54.829461    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:54.829469    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:54.829473    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:54.831681    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:55.330304    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:55.330329    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:55.330339    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:55.330343    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:55.333464    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:55.829335    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:55.829357    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:55.829372    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:55.829379    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:55.832747    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:56.329562    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:56.329574    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:56.329580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:56.329583    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:56.331549    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:56.830534    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:56.830555    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:56.830566    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:56.830571    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:56.834033    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:56.834111    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:57.329183    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:57.329210    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:57.329220    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:57.329229    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:57.332084    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:57.829250    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:57.829263    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:57.829269    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:57.829273    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:57.831424    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:58.329091    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:58.329116    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:58.329126    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:58.329131    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:58.331697    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:58.831142    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:58.831167    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:58.831178    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:58.831185    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:58.834492    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:58.834557    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:59.329237    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:59.329252    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:59.329257    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:59.329261    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:59.331512    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:59.829320    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:59.829342    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:59.829353    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:59.829359    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:59.832197    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:00.330432    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:00.330451    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:00.330462    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:00.330471    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:00.333067    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:00.829237    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:00.829253    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:00.829260    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:00.829263    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:00.831418    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:01.329358    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:01.329379    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:01.329388    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:01.329393    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:01.332371    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:01.332438    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:01.830578    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:01.830604    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:01.830617    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:01.830623    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:01.834159    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:02.329157    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:02.329173    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:02.329179    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:02.329182    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:02.331067    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:49:02.831085    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:02.831112    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:02.831123    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:02.831130    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:02.834437    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:03.331085    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:03.331109    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:03.331152    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:03.331159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:03.334347    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:03.334422    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:03.829836    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:03.829853    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:03.829859    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:03.829863    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:03.831902    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:04.331065    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:04.331089    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:04.331100    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:04.331107    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:04.334167    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:04.831234    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:04.831261    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:04.831273    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:04.831279    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:04.834602    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:05.330136    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:05.330151    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:05.330157    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:05.330160    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:05.332374    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:05.830128    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:05.830150    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:05.830165    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:05.830171    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:05.834152    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:05.834213    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:06.329879    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:06.329904    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:06.329915    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:06.329924    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:06.332822    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:06.829369    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:06.829385    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:06.829390    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:06.829393    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:06.831713    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:07.329339    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:07.329361    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:07.329373    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:07.329380    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:07.332647    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:07.830352    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:07.830380    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:07.830437    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:07.830448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:07.833556    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:08.329058    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:08.329073    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:08.329079    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:08.329082    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:08.331089    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:49:08.331148    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:08.830337    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:08.830361    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:08.830372    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:08.830379    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:08.833728    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:08.833817    4003 node_ready.go:38] duration metric: took 4m0.004911985s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:49:08.856471    4003 out.go:201] 
	W0831 15:49:08.878133    4003 out.go:270] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0831 15:49:08.878147    4003 out.go:270] * 
	W0831 15:49:08.878920    4003 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 15:49:08.943376    4003 out.go:201] 
	
	
	==> Docker <==
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.332263033Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.370445559Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.370708492Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.370824991Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.371374304Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.371365499Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.371690677Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.371839495Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.372326970Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.374135025Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.379001438Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.379117671Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.381398964Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.411323783Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.411385669Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.411398736Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.411510078Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:45:09 ha-949000 dockerd[1154]: time="2024-08-31T22:45:09.824046002Z" level=info msg="ignoring event" container=216b25e04efdd68fa78ff1cfc79456f27ab236602c5e05f800a59fa3cb220480 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Aug 31 22:45:09 ha-949000 dockerd[1161]: time="2024-08-31T22:45:09.824322056Z" level=info msg="shim disconnected" id=216b25e04efdd68fa78ff1cfc79456f27ab236602c5e05f800a59fa3cb220480 namespace=moby
	Aug 31 22:45:09 ha-949000 dockerd[1161]: time="2024-08-31T22:45:09.824375729Z" level=warning msg="cleaning up after shim disconnected" id=216b25e04efdd68fa78ff1cfc79456f27ab236602c5e05f800a59fa3cb220480 namespace=moby
	Aug 31 22:45:09 ha-949000 dockerd[1161]: time="2024-08-31T22:45:09.824925130Z" level=info msg="cleaning up dead shim" namespace=moby
	Aug 31 22:45:23 ha-949000 dockerd[1161]: time="2024-08-31T22:45:23.385665751Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:45:23 ha-949000 dockerd[1161]: time="2024-08-31T22:45:23.385739452Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:45:23 ha-949000 dockerd[1161]: time="2024-08-31T22:45:23.385752198Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:45:23 ha-949000 dockerd[1161]: time="2024-08-31T22:45:23.385842000Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	11a121a84e236       6e38f40d628db       3 minutes ago       Running             storage-provisioner       4                   675a87e7bbf1d       storage-provisioner
	18fa81194c803       8c811b4aec35f       4 minutes ago       Running             busybox                   2                   a1fb1144f3287       busybox-7dff88458-5kkbw
	5b45844943a70       cbb01a7bd410d       4 minutes ago       Running             coredns                   2                   4ab6f492ffa53       coredns-6f6b679f8f-snq8s
	39caece4a1a06       12968670680f4       4 minutes ago       Running             kindnet-cni               2                   84921ed532424       kindnet-jzj42
	216b25e04efdd       6e38f40d628db       4 minutes ago       Exited              storage-provisioner       3                   675a87e7bbf1d       storage-provisioner
	92325d0ba5d32       cbb01a7bd410d       4 minutes ago       Running             coredns                   2                   9a17b13011ad6       coredns-6f6b679f8f-kjszm
	ce00ce382bb0c       ad83b2ca7b09e       4 minutes ago       Running             kube-proxy                2                   563c95c71d5ae       kube-proxy-q7ndn
	ca5e9a101fac2       045733566833c       5 minutes ago       Running             kube-controller-manager   4                   0976cb0a1281b       kube-controller-manager-ha-949000
	8be9164123bc9       604f5db92eaa8       5 minutes ago       Running             kube-apiserver            3                   e0447c649afe4       kube-apiserver-ha-949000
	6c320a1f78aee       1766f54c897f0       5 minutes ago       Running             kube-scheduler            2                   515614d004b25       kube-scheduler-ha-949000
	c016f5fcb7d72       2e96e5913fc06       5 minutes ago       Running             etcd                      2                   716e9fa824b03       etcd-ha-949000
	981e8e790a392       045733566833c       5 minutes ago       Exited              kube-controller-manager   3                   0976cb0a1281b       kube-controller-manager-ha-949000
	23e342681c007       38af8ddebf499       5 minutes ago       Running             kube-vip                  1                   87b3e236006c5       kube-vip-ha-949000
	6966a01f96234       604f5db92eaa8       5 minutes ago       Exited              kube-apiserver            2                   e0447c649afe4       kube-apiserver-ha-949000
	f5deb862745e4       8c811b4aec35f       11 minutes ago      Exited              busybox                   1                   88b8aff8a006d       busybox-7dff88458-5kkbw
	f89b862064139       ad83b2ca7b09e       11 minutes ago      Exited              kube-proxy                1                   eb9132907eda4       kube-proxy-q7ndn
	ac487ac32c364       cbb01a7bd410d       11 minutes ago      Exited              coredns                   1                   b2a8128cbfc29       coredns-6f6b679f8f-snq8s
	ff98d7e38a1e6       12968670680f4       11 minutes ago      Exited              kindnet-cni               1                   fc1aa95e54f86       kindnet-jzj42
	c4dc6059b2150       cbb01a7bd410d       11 minutes ago      Exited              coredns                   1                   9b710526ef4f9       coredns-6f6b679f8f-kjszm
	5b0ac6b7faf7d       1766f54c897f0       12 minutes ago      Exited              kube-scheduler            1                   6e330e66cf27f       kube-scheduler-ha-949000
	2255978551ea3       2e96e5913fc06       12 minutes ago      Exited              etcd                      1                   d62930734f2f9       etcd-ha-949000
	0bb147eb5f408       38af8ddebf499       12 minutes ago      Exited              kube-vip                  0                   9ac139ab4844d       kube-vip-ha-949000
	
	
	==> coredns [5b45844943a7] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:47900 - 36884 "HINFO IN 2333551711870933102.2340796284351020766. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.008323198s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1041136774]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.776) (total time: 30000ms):
	Trace[1041136774]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (22:45:09.776)
	Trace[1041136774]: [30.000488845s] [30.000488845s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[2116759242]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.776) (total time: 30000ms):
	Trace[2116759242]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (22:45:09.777)
	Trace[2116759242]: [30.00030351s] [30.00030351s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[693026538]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.777) (total time: 30000ms):
	Trace[693026538]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (22:45:09.777)
	Trace[693026538]: [30.000248071s] [30.000248071s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [92325d0ba5d3] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37396 - 48689 "HINFO IN 9162885205725873992.3311076006694622340. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.008859861s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[180755621]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.768) (total time: 30001ms):
	Trace[180755621]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (22:45:09.769)
	Trace[180755621]: [30.001189401s] [30.001189401s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1144270708]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.770) (total time: 30001ms):
	Trace[1144270708]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (22:45:09.772)
	Trace[1144270708]: [30.001530888s] [30.001530888s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[735366369]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.772) (total time: 30000ms):
	Trace[735366369]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (22:45:09.773)
	Trace[735366369]: [30.000672378s] [30.000672378s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [ac487ac32c36] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37668 - 17883 "HINFO IN 4931414995021238036.4254872758042696539. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.026863898s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1645472327]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30003ms):
	Trace[1645472327]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30002ms (22:37:45.839)
	Trace[1645472327]: [30.003429832s] [30.003429832s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[2054948566]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.838) (total time: 30003ms):
	Trace[2054948566]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30003ms (22:37:45.841)
	Trace[2054948566]: [30.003549662s] [30.003549662s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[850581595]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.840) (total time: 30001ms):
	Trace[850581595]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (22:37:45.841)
	Trace[850581595]: [30.001289039s] [30.001289039s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [c4dc6059b215] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:55597 - 61955 "HINFO IN 5411809642052316829.545085282119266902. udp 56 false 512" NXDOMAIN qr,rd,ra 131 0.026601414s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1248174265]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30003ms):
	Trace[1248174265]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30002ms (22:37:45.839)
	Trace[1248174265]: [30.003765448s] [30.003765448s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[313955954]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.840) (total time: 30001ms):
	Trace[313955954]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (22:37:45.841)
	Trace[313955954]: [30.001623019s] [30.001623019s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1099528094]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30004ms):
	Trace[1099528094]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30004ms (22:37:45.842)
	Trace[1099528094]: [30.004679878s] [30.004679878s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	Name:               ha-949000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:29:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:49:06 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:44:10 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:44:10 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:44:10 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:44:10 +0000   Sat, 31 Aug 2024 22:37:06 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-949000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 758fb98d149341c7ae245ce9491d8a0f
	  System UUID:                98ca49d1-0000-0000-9e6c-321a4533d56e
	  Boot ID:                    3fc4eb3a-1e97-462c-91b1-b27289849703
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-5kkbw              0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 coredns-6f6b679f8f-kjszm             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     19m
	  kube-system                 coredns-6f6b679f8f-snq8s             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     19m
	  kube-system                 etcd-ha-949000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         19m
	  kube-system                 kindnet-jzj42                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      19m
	  kube-system                 kube-apiserver-ha-949000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-controller-manager-ha-949000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-proxy-q7ndn                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-scheduler-ha-949000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-vip-ha-949000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         19m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 11m                    kube-proxy       
	  Normal  Starting                 19m                    kube-proxy       
	  Normal  Starting                 4m31s                  kube-proxy       
	  Normal  NodeHasNoDiskPressure    19m                    kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     19m                    kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  Starting                 19m                    kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  19m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  19m                    kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  RegisteredNode           19m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeReady                19m                    kubelet          Node ha-949000 status is now: NodeReady
	  Normal  RegisteredNode           18m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           17m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           14m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  Starting                 12m                    kubelet          Starting kubelet.
	  Normal  NodeHasNoDiskPressure    12m (x8 over 12m)      kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientMemory  12m (x8 over 12m)      kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasSufficientPID     12m (x7 over 12m)      kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  12m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           12m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           11m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           11m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeHasSufficientPID     5m57s (x7 over 5m57s)  kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  Starting                 5m57s                  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  5m57s (x8 over 5m57s)  kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    5m57s (x8 over 5m57s)  kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeAllocatableEnforced  5m57s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           5m2s                   node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           5m2s                   node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	
	
	Name:               ha-949000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:30:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:49:03 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:44:07 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:44:07 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:44:07 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:44:07 +0000   Sat, 31 Aug 2024 22:31:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-949000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 65e22cd2b0314498aa33bf9e04730c6a
	  System UUID:                23e54f3d-0000-0000-86b7-b25c818528d1
	  Boot ID:                    1d744b30-5098-4929-bff2-54bd26848d21
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-6r9s5                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 etcd-ha-949000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         18m
	  kube-system                 kindnet-brtj6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      18m
	  kube-system                 kube-apiserver-ha-949000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-controller-manager-ha-949000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-proxy-4r2bt                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-scheduler-ha-949000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-vip-ha-949000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 4m46s                  kube-proxy       
	  Normal   Starting                 18m                    kube-proxy       
	  Normal   Starting                 14m                    kube-proxy       
	  Normal   Starting                 12m                    kube-proxy       
	  Normal   NodeHasNoDiskPressure    18m (x8 over 18m)      kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     18m (x7 over 18m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  18m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  18m (x8 over 18m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   RegisteredNode           18m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           18m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           17m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   Starting                 14m                    kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  14m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  14m                    kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    14m                    kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     14m                    kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 14m                    kubelet          Node ha-949000-m02 has been rebooted, boot id: 4ddbe4b0-7ef0-4715-a631-f977c123c463
	  Normal   RegisteredNode           14m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   Starting                 12m                    kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  12m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientPID     12m (x7 over 12m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeHasSufficientMemory  12m (x8 over 12m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    12m (x8 over 12m)      kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   RegisteredNode           12m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           11m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           11m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   Starting                 5m14s                  kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  5m14s (x8 over 5m14s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    5m14s (x8 over 5m14s)  kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     5m14s (x7 over 5m14s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  5m14s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           5m2s                   node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           5m2s                   node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	
	
	==> dmesg <==
	[  +0.000001] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.034690] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.008037] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[Aug31 22:43] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000000] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.006696] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.610055] systemd-fstab-generator[126]: Ignoring "noauto" option for root device
	[  +2.275477] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000012] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.654488] systemd-fstab-generator[460]: Ignoring "noauto" option for root device
	[  +0.100018] systemd-fstab-generator[472]: Ignoring "noauto" option for root device
	[  +1.963372] systemd-fstab-generator[1084]: Ignoring "noauto" option for root device
	[  +0.243624] systemd-fstab-generator[1119]: Ignoring "noauto" option for root device
	[  +0.053609] kauditd_printk_skb: 101 callbacks suppressed
	[  +0.048738] systemd-fstab-generator[1131]: Ignoring "noauto" option for root device
	[  +0.109338] systemd-fstab-generator[1145]: Ignoring "noauto" option for root device
	[  +2.485755] systemd-fstab-generator[1361]: Ignoring "noauto" option for root device
	[  +0.105237] systemd-fstab-generator[1373]: Ignoring "noauto" option for root device
	[  +0.097354] systemd-fstab-generator[1385]: Ignoring "noauto" option for root device
	[  +0.120488] systemd-fstab-generator[1400]: Ignoring "noauto" option for root device
	[  +0.432211] systemd-fstab-generator[1563]: Ignoring "noauto" option for root device
	[  +6.838586] kauditd_printk_skb: 212 callbacks suppressed
	[ +21.319655] kauditd_printk_skb: 40 callbacks suppressed
	[Aug31 22:45] kauditd_printk_skb: 78 callbacks suppressed
	
	
	==> etcd [2255978551ea] <==
	{"level":"warn","ts":"2024-08-31T22:42:48.058603Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"5.232147243s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/volumeattachments/\" range_end:\"/registry/volumeattachments0\" count_only:true ","response":"","error":"context canceled"}
	{"level":"info","ts":"2024-08-31T22:42:48.058634Z","caller":"traceutil/trace.go:171","msg":"trace[1372143637] range","detail":"{range_begin:/registry/volumeattachments/; range_end:/registry/volumeattachments0; }","duration":"5.232181152s","start":"2024-08-31T22:42:42.826450Z","end":"2024-08-31T22:42:48.058631Z","steps":["trace[1372143637] 'agreement among raft nodes before linearized reading'  (duration: 5.232147269s)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:42:48.058649Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-08-31T22:42:42.826415Z","time spent":"5.232228764s","remote":"127.0.0.1:48278","response type":"/etcdserverpb.KV/Range","request count":0,"request size":62,"response count":0,"response size":0,"request content":"key:\"/registry/volumeattachments/\" range_end:\"/registry/volumeattachments0\" count_only:true "}
	2024/08/31 22:42:48 WARNING: [core] [Server #7] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-08-31T22:42:48.058725Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"4.593098977s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/apiextensions.k8s.io/customresourcedefinitions/\" range_end:\"/registry/apiextensions.k8s.io/customresourcedefinitions0\" count_only:true ","response":"","error":"context canceled"}
	{"level":"info","ts":"2024-08-31T22:42:48.058739Z","caller":"traceutil/trace.go:171","msg":"trace[371527565] range","detail":"{range_begin:/registry/apiextensions.k8s.io/customresourcedefinitions/; range_end:/registry/apiextensions.k8s.io/customresourcedefinitions0; }","duration":"4.593114862s","start":"2024-08-31T22:42:43.465620Z","end":"2024-08-31T22:42:48.058734Z","steps":["trace[371527565] 'agreement among raft nodes before linearized reading'  (duration: 4.593098849s)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:42:48.058751Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-08-31T22:42:43.465603Z","time spent":"4.593143993s","remote":"127.0.0.1:47898","response type":"/etcdserverpb.KV/Range","request count":0,"request size":120,"response count":0,"response size":0,"request content":"key:\"/registry/apiextensions.k8s.io/customresourcedefinitions/\" range_end:\"/registry/apiextensions.k8s.io/customresourcedefinitions0\" count_only:true "}
	2024/08/31 22:42:48 WARNING: [core] [Server #7] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-08-31T22:42:48.058755Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"1.313645842s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/health\" ","response":"","error":"context canceled"}
	{"level":"info","ts":"2024-08-31T22:42:48.058776Z","caller":"traceutil/trace.go:171","msg":"trace[1159639805] range","detail":"{range_begin:/registry/health; range_end:; }","duration":"1.313669945s","start":"2024-08-31T22:42:46.745100Z","end":"2024-08-31T22:42:48.058770Z","steps":["trace[1159639805] 'agreement among raft nodes before linearized reading'  (duration: 1.313645254s)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:42:48.058793Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-08-31T22:42:46.745083Z","time spent":"1.313705515s","remote":"127.0.0.1:42196","response type":"/etcdserverpb.KV/Range","request count":0,"request size":18,"response count":0,"response size":0,"request content":"key:\"/registry/health\" "}
	2024/08/31 22:42:48 WARNING: [core] [Server #7] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-08-31T22:42:48.096976Z","caller":"embed/serve.go:212","msg":"stopping secure grpc server due to error","error":"accept tcp 192.169.0.5:2379: use of closed network connection"}
	{"level":"warn","ts":"2024-08-31T22:42:48.097045Z","caller":"embed/serve.go:214","msg":"stopped secure grpc server due to error","error":"accept tcp 192.169.0.5:2379: use of closed network connection"}
	{"level":"info","ts":"2024-08-31T22:42:48.097084Z","caller":"etcdserver/server.go:1512","msg":"skipped leadership transfer; local server is not leader","local-member-id":"b8c6c7563d17d844","current-leader-member-id":"0"}
	{"level":"info","ts":"2024-08-31T22:42:48.097205Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097236Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097251Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097335Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097380Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097428Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097439Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.098722Z","caller":"embed/etcd.go:581","msg":"stopping serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-08-31T22:42:48.098784Z","caller":"embed/etcd.go:586","msg":"stopped serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-08-31T22:42:48.098805Z","caller":"embed/etcd.go:379","msg":"closed etcd server","name":"ha-949000","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.169.0.5:2380"],"advertise-client-urls":["https://192.169.0.5:2379"]}
	
	
	==> etcd [c016f5fcb7d7] <==
	{"level":"info","ts":"2024-08-31T22:44:05.120780Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from 316786cc150e7430 at term 3"}
	{"level":"info","ts":"2024-08-31T22:44:05.120880Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 has received 2 MsgPreVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2024-08-31T22:44:05.120932Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became candidate at term 4"}
	{"level":"info","ts":"2024-08-31T22:44:05.121010Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgVoteResp from b8c6c7563d17d844 at term 4"}
	{"level":"info","ts":"2024-08-31T22:44:05.121054Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 3, index: 3399] sent MsgVote request to 316786cc150e7430 at term 4"}
	{"level":"info","ts":"2024-08-31T22:44:05.127724Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgVoteResp from 316786cc150e7430 at term 4"}
	{"level":"info","ts":"2024-08-31T22:44:05.127758Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 has received 2 MsgVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2024-08-31T22:44:05.127768Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became leader at term 4"}
	{"level":"info","ts":"2024-08-31T22:44:05.127774Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b8c6c7563d17d844 elected leader b8c6c7563d17d844 at term 4"}
	{"level":"warn","ts":"2024-08-31T22:44:05.127917Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"6.689608382s","expected-duration":"100ms","prefix":"read-only range ","request":"limit:1 keys_only:true ","response":"","error":"etcdserver: leader changed"}
	{"level":"info","ts":"2024-08-31T22:44:05.127991Z","caller":"traceutil/trace.go:171","msg":"trace[1148977671] range","detail":"{range_begin:; range_end:; }","duration":"6.689689339s","start":"2024-08-31T22:43:58.438295Z","end":"2024-08-31T22:44:05.127985Z","steps":["trace[1148977671] 'agreement among raft nodes before linearized reading'  (duration: 6.689607072s)"],"step_count":1}
	{"level":"error","ts":"2024-08-31T22:44:05.128131Z","caller":"etcdhttp/health.go:367","msg":"Health check error","path":"/readyz","reason":"[+]serializable_read ok\n[-]linearizable_read failed: etcdserver: leader changed\n[+]data_corruption ok\n","status-code":503,"stacktrace":"go.etcd.io/etcd/server/v3/etcdserver/api/etcdhttp.(*CheckRegistry).installRootHttpEndpoint.newHealthHandler.func2\n\tgo.etcd.io/etcd/server/v3/etcdserver/api/etcdhttp/health.go:367\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2141\nnet/http.(*ServeMux).ServeHTTP\n\tnet/http/server.go:2519\nnet/http.serverHandler.ServeHTTP\n\tnet/http/server.go:2943\nnet/http.(*conn).serve\n\tnet/http/server.go:2014"}
	{"level":"info","ts":"2024-08-31T22:44:05.150958Z","caller":"etcdserver/server.go:2118","msg":"published local member to cluster through raft","local-member-id":"b8c6c7563d17d844","local-member-attributes":"{Name:ha-949000 ClientURLs:[https://192.169.0.5:2379]}","request-path":"/0/members/b8c6c7563d17d844/attributes","cluster-id":"b73189effde9bc63","publish-timeout":"7s"}
	{"level":"info","ts":"2024-08-31T22:44:05.150981Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-08-31T22:44:05.151248Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-08-31T22:44:05.151305Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-08-31T22:44:05.150995Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-08-31T22:44:05.152353Z","caller":"v3rpc/health.go:61","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2024-08-31T22:44:05.152948Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.169.0.5:2379"}
	{"level":"info","ts":"2024-08-31T22:44:05.153986Z","caller":"v3rpc/health.go:61","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2024-08-31T22:44:05.154699Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"warn","ts":"2024-08-31T22:44:05.156858Z","caller":"embed/config_logging.go:170","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:57552","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-08-31T22:44:05.158972Z","caller":"embed/config_logging.go:170","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:57550","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-08-31T22:44:05.159979Z","caller":"embed/config_logging.go:170","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:57566","server-name":"","error":"EOF"}
	{"level":"info","ts":"2024-08-31T22:47:07.419387Z","caller":"traceutil/trace.go:171","msg":"trace[1165805083] transaction","detail":"{read_only:false; response_revision:3483; number_of_response:1; }","duration":"106.737728ms","start":"2024-08-31T22:47:07.312582Z","end":"2024-08-31T22:47:07.419319Z","steps":["trace[1165805083] 'process raft request'  (duration: 106.170956ms)"],"step_count":1}
	
	
	==> kernel <==
	 22:49:11 up 6 min,  0 users,  load average: 0.14, 0.15, 0.08
	Linux ha-949000 5.10.207 #1 SMP Wed Aug 28 20:54:17 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [39caece4a1a0] <==
	I0831 22:48:10.861764       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:48:20.863509       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:48:20.863727       1 main.go:299] handling current node
	I0831 22:48:20.863784       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:48:20.863924       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:48:30.864190       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:48:30.864260       1 main.go:299] handling current node
	I0831 22:48:30.864277       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:48:30.864286       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:48:40.856623       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:48:40.856692       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:48:40.857135       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:48:40.857188       1 main.go:299] handling current node
	I0831 22:48:50.862540       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:48:50.862580       1 main.go:299] handling current node
	I0831 22:48:50.862592       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:48:50.862596       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:49:00.863541       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:49:00.863854       1 main.go:299] handling current node
	I0831 22:49:00.864119       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:49:00.864284       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:49:10.855145       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:49:10.855391       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:49:10.855614       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:49:10.855724       1 main.go:299] handling current node
	
	
	==> kindnet [ff98d7e38a1e] <==
	I0831 22:42:06.419355       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:06.419448       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:06.419540       1 main.go:299] handling current node
	I0831 22:42:06.419587       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:06.419596       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:16.418758       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:42:16.418878       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:16.419144       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:16.419199       1 main.go:299] handling current node
	I0831 22:42:16.419230       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:16.419256       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:26.418790       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:26.418914       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:26.419229       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:42:26.419399       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:26.419700       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:26.419804       1 main.go:299] handling current node
	I0831 22:42:36.424537       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:36.424582       1 main.go:299] handling current node
	I0831 22:42:36.424595       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:36.424600       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:46.420454       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:46.420626       1 main.go:299] handling current node
	I0831 22:42:46.420750       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:46.420997       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [6966a01f9623] <==
	I0831 22:43:21.428428       1 options.go:228] external host was not specified, using 192.169.0.5
	I0831 22:43:21.432400       1 server.go:142] Version: v1.31.0
	I0831 22:43:21.432438       1 server.go:144] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:43:22.144103       1 shared_informer.go:313] Waiting for caches to sync for node_authorizer
	I0831 22:43:22.155916       1 shared_informer.go:313] Waiting for caches to sync for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0831 22:43:22.159940       1 plugins.go:157] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0831 22:43:22.160055       1 plugins.go:160] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0831 22:43:22.162610       1 instance.go:232] Using reconciler: lease
	W0831 22:43:42.140091       1 logging.go:55] [core] [Channel #1 SubChannel #3]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0831 22:43:42.143285       1 logging.go:55] [core] [Channel #2 SubChannel #4]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	F0831 22:43:42.166382       1 instance.go:225] Error creating leases: error creating storage factory: context deadline exceeded
	
	
	==> kube-apiserver [8be9164123bc] <==
	I0831 22:44:06.003204       1 crd_finalizer.go:269] Starting CRDFinalizer
	I0831 22:44:06.006692       1 crdregistration_controller.go:114] Starting crd-autoregister controller
	I0831 22:44:06.006723       1 shared_informer.go:313] Waiting for caches to sync for crd-autoregister
	I0831 22:44:06.092264       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0831 22:44:06.092286       1 cache.go:39] Caches are synced for RemoteAvailability controller
	I0831 22:44:06.093114       1 cache.go:39] Caches are synced for LocalAvailability controller
	I0831 22:44:06.093378       1 shared_informer.go:320] Caches are synced for configmaps
	I0831 22:44:06.093412       1 apf_controller.go:382] Running API Priority and Fairness config worker
	I0831 22:44:06.093670       1 apf_controller.go:385] Running API Priority and Fairness periodic rebalancing process
	I0831 22:44:06.100700       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0831 22:44:06.107001       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0831 22:44:06.107526       1 aggregator.go:171] initial CRD sync complete...
	I0831 22:44:06.107618       1 autoregister_controller.go:144] Starting autoregister controller
	I0831 22:44:06.107626       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0831 22:44:06.107667       1 cache.go:39] Caches are synced for autoregister controller
	I0831 22:44:06.117188       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0831 22:44:06.117360       1 policy_source.go:224] refreshing policies
	I0831 22:44:06.117520       1 handler_discovery.go:450] Starting ResourceDiscoveryManager
	I0831 22:44:06.117786       1 shared_informer.go:320] Caches are synced for node_authorizer
	E0831 22:44:06.126468       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I0831 22:44:06.191208       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0831 22:44:06.997070       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	W0831 22:44:07.236923       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	I0831 22:44:07.238029       1 controller.go:615] quota admission added evaluator for: endpoints
	I0831 22:44:07.242198       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	
	
	==> kube-controller-manager [981e8e790a39] <==
	I0831 22:43:21.974417       1 serving.go:386] Generated self-signed cert in-memory
	I0831 22:43:22.496926       1 controllermanager.go:197] "Starting" version="v1.31.0"
	I0831 22:43:22.497066       1 controllermanager.go:199] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:43:22.499991       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0831 22:43:22.500099       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I0831 22:43:22.500173       1 dynamic_cafile_content.go:160] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0831 22:43:22.500184       1 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	E0831 22:43:43.177282       1 controllermanager.go:242] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.169.0.5:8443/healthz\": dial tcp 192.169.0.5:8443: connect: connection refused"
	
	
	==> kube-controller-manager [ca5e9a101fac] <==
	E0831 22:44:49.408368       1 gc_controller.go:151] "Failed to get node" err="node \"ha-949000-m03\" not found" logger="pod-garbage-collector-controller" node="ha-949000-m03"
	E0831 22:44:49.408645       1 gc_controller.go:151] "Failed to get node" err="node \"ha-949000-m03\" not found" logger="pod-garbage-collector-controller" node="ha-949000-m03"
	E0831 22:44:49.408845       1 gc_controller.go:151] "Failed to get node" err="node \"ha-949000-m03\" not found" logger="pod-garbage-collector-controller" node="ha-949000-m03"
	E0831 22:44:49.409004       1 gc_controller.go:151] "Failed to get node" err="node \"ha-949000-m03\" not found" logger="pod-garbage-collector-controller" node="ha-949000-m03"
	E0831 22:44:49.409126       1 gc_controller.go:151] "Failed to get node" err="node \"ha-949000-m03\" not found" logger="pod-garbage-collector-controller" node="ha-949000-m03"
	I0831 22:44:49.416943       1 gc_controller.go:342] "PodGC is force deleting Pod" logger="pod-garbage-collector-controller" pod="kube-system/kube-controller-manager-ha-949000-m03"
	I0831 22:44:49.433565       1 gc_controller.go:258] "Forced deletion of orphaned Pod succeeded" logger="pod-garbage-collector-controller" pod="kube-system/kube-controller-manager-ha-949000-m03"
	I0831 22:44:49.433808       1 gc_controller.go:342] "PodGC is force deleting Pod" logger="pod-garbage-collector-controller" pod="kube-system/etcd-ha-949000-m03"
	I0831 22:44:49.447313       1 gc_controller.go:258] "Forced deletion of orphaned Pod succeeded" logger="pod-garbage-collector-controller" pod="kube-system/etcd-ha-949000-m03"
	I0831 22:44:49.447344       1 gc_controller.go:342] "PodGC is force deleting Pod" logger="pod-garbage-collector-controller" pod="kube-system/kube-apiserver-ha-949000-m03"
	I0831 22:44:49.462532       1 gc_controller.go:258] "Forced deletion of orphaned Pod succeeded" logger="pod-garbage-collector-controller" pod="kube-system/kube-apiserver-ha-949000-m03"
	I0831 22:44:49.462566       1 gc_controller.go:342] "PodGC is force deleting Pod" logger="pod-garbage-collector-controller" pod="kube-system/kube-proxy-d45q5"
	I0831 22:44:49.477251       1 gc_controller.go:258] "Forced deletion of orphaned Pod succeeded" logger="pod-garbage-collector-controller" pod="kube-system/kube-proxy-d45q5"
	I0831 22:44:49.477287       1 gc_controller.go:342] "PodGC is force deleting Pod" logger="pod-garbage-collector-controller" pod="kube-system/kube-scheduler-ha-949000-m03"
	I0831 22:44:49.490910       1 gc_controller.go:258] "Forced deletion of orphaned Pod succeeded" logger="pod-garbage-collector-controller" pod="kube-system/kube-scheduler-ha-949000-m03"
	I0831 22:44:49.490945       1 gc_controller.go:342] "PodGC is force deleting Pod" logger="pod-garbage-collector-controller" pod="kube-system/kube-vip-ha-949000-m03"
	I0831 22:44:49.502534       1 gc_controller.go:258] "Forced deletion of orphaned Pod succeeded" logger="pod-garbage-collector-controller" pod="kube-system/kube-vip-ha-949000-m03"
	I0831 22:44:49.502598       1 gc_controller.go:342] "PodGC is force deleting Pod" logger="pod-garbage-collector-controller" pod="kube-system/kindnet-9j85v"
	I0831 22:44:49.516732       1 gc_controller.go:258] "Forced deletion of orphaned Pod succeeded" logger="pod-garbage-collector-controller" pod="kube-system/kindnet-9j85v"
	I0831 22:45:18.658025       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-6f6b679f8f" duration="7.424367ms"
	I0831 22:45:18.658293       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-6f6b679f8f" duration="41.373µs"
	I0831 22:45:18.682021       1 endpointslice_controller.go:344] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mxss9 EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mxss9\": the object has been modified; please apply your changes to the latest version and try again"
	I0831 22:45:18.682357       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-6f6b679f8f" duration="11.16125ms"
	I0831 22:45:18.682693       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"c225b6ce-9d24-451b-aa4c-2f6d57886b05", APIVersion:"v1", ResourceVersion:"257", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mxss9 EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mxss9": the object has been modified; please apply your changes to the latest version and try again
	I0831 22:45:18.683281       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-6f6b679f8f" duration="41.578µs"
	
	
	==> kube-proxy [ce00ce382bb0] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:44:39.825017       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:44:39.836111       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:44:39.836175       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:44:39.866373       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:44:39.866419       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:44:39.866438       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:44:39.868916       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:44:39.869454       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:44:39.869482       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:44:39.871479       1 config.go:197] "Starting service config controller"
	I0831 22:44:39.871768       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:44:39.871891       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:44:39.871917       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:44:39.872900       1 config.go:326] "Starting node config controller"
	I0831 22:44:39.872926       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:44:39.972753       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0831 22:44:39.972790       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:44:39.973169       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-proxy [f89b86206413] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:37:16.195275       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:37:16.220357       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:37:16.220590       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:37:16.265026       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:37:16.265177       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:37:16.265305       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:37:16.268348       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:37:16.268734       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:37:16.269061       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:37:16.272514       1 config.go:197] "Starting service config controller"
	I0831 22:37:16.273450       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:37:16.273658       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:37:16.273777       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:37:16.275413       1 config.go:326] "Starting node config controller"
	I0831 22:37:16.277042       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:37:16.374257       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:37:16.375624       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0831 22:37:16.377606       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [5b0ac6b7faf7] <==
	I0831 22:36:35.937574       1 serving.go:386] Generated self-signed cert in-memory
	W0831 22:36:46.491998       1 authentication.go:370] Error looking up in-cluster authentication configuration: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication": net/http: TLS handshake timeout
	W0831 22:36:46.492020       1 authentication.go:371] Continuing without authentication configuration. This may treat all requests as anonymous.
	W0831 22:36:46.492025       1 authentication.go:372] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I0831 22:36:55.901677       1 server.go:167] "Starting Kubernetes Scheduler" version="v1.31.0"
	I0831 22:36:55.901714       1 server.go:169] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:36:55.904943       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0831 22:36:55.905195       1 secure_serving.go:213] Serving securely on 127.0.0.1:10259
	I0831 22:36:55.905729       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I0831 22:36:55.906036       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0831 22:36:56.006746       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0831 22:42:48.053419       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [6c320a1f78ae] <==
	W0831 22:44:06.040548       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0831 22:44:06.040633       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.040747       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0831 22:44:06.040846       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.040950       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0831 22:44:06.041038       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csinodes\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.041162       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0831 22:44:06.041213       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.041282       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0831 22:44:06.041331       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.041423       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0831 22:44:06.041509       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.041630       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0831 22:44:06.041716       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.041839       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0831 22:44:06.041890       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.042112       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User "system:kube-scheduler" cannot list resource "poddisruptionbudgets" in API group "policy" at the cluster scope
	E0831 22:44:06.042218       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User \"system:kube-scheduler\" cannot list resource \"poddisruptionbudgets\" in API group \"policy\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.042375       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0831 22:44:06.042426       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.042804       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0831 22:44:06.042841       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.061311       1 reflector.go:561] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0831 22:44:06.061431       1 reflector.go:158] "Unhandled Error" err="runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError"
	I0831 22:44:21.686813       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Aug 31 22:44:38 ha-949000 kubelet[1570]: I0831 22:44:38.437619    1570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
	Aug 31 22:45:10 ha-949000 kubelet[1570]: I0831 22:45:10.098206    1570 scope.go:117] "RemoveContainer" containerID="9743646580e076090272f2d7ba4ce73b0321b5f2db9203294907b9d81d5ad94a"
	Aug 31 22:45:10 ha-949000 kubelet[1570]: I0831 22:45:10.098400    1570 scope.go:117] "RemoveContainer" containerID="216b25e04efdd68fa78ff1cfc79456f27ab236602c5e05f800a59fa3cb220480"
	Aug 31 22:45:10 ha-949000 kubelet[1570]: E0831 22:45:10.098514    1570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(03bcdd23-f7f2-45a9-ab95-91918e094226)\"" pod="kube-system/storage-provisioner" podUID="03bcdd23-f7f2-45a9-ab95-91918e094226"
	Aug 31 22:45:14 ha-949000 kubelet[1570]: E0831 22:45:14.356421    1570 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:45:14 ha-949000 kubelet[1570]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:45:14 ha-949000 kubelet[1570]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:45:14 ha-949000 kubelet[1570]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:45:14 ha-949000 kubelet[1570]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:45:23 ha-949000 kubelet[1570]: I0831 22:45:23.340160    1570 scope.go:117] "RemoveContainer" containerID="216b25e04efdd68fa78ff1cfc79456f27ab236602c5e05f800a59fa3cb220480"
	Aug 31 22:46:14 ha-949000 kubelet[1570]: E0831 22:46:14.356754    1570 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:46:14 ha-949000 kubelet[1570]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:46:14 ha-949000 kubelet[1570]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:46:14 ha-949000 kubelet[1570]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:46:14 ha-949000 kubelet[1570]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:47:14 ha-949000 kubelet[1570]: E0831 22:47:14.357076    1570 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:47:14 ha-949000 kubelet[1570]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:47:14 ha-949000 kubelet[1570]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:47:14 ha-949000 kubelet[1570]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:47:14 ha-949000 kubelet[1570]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:48:14 ha-949000 kubelet[1570]: E0831 22:48:14.356617    1570 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:48:14 ha-949000 kubelet[1570]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:48:14 ha-949000 kubelet[1570]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:48:14 ha-949000 kubelet[1570]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:48:14 ha-949000 kubelet[1570]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-949000 -n ha-949000
helpers_test.go:262: (dbg) Run:  kubectl --context ha-949000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:273: non-running pods: busybox-7dff88458-g8b59
helpers_test.go:275: ======> post-mortem[TestMultiControlPlane/serial/RestartCluster]: describe non-running pods <======
helpers_test.go:278: (dbg) Run:  kubectl --context ha-949000 describe pod busybox-7dff88458-g8b59
helpers_test.go:283: (dbg) kubectl --context ha-949000 describe pod busybox-7dff88458-g8b59:

                                                
                                                
-- stdout --
	Name:             busybox-7dff88458-g8b59
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=7dff88458
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-7dff88458
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-jmpb5 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-jmpb5:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                    From               Message
	  ----     ------            ----                   ----               -------
	  Warning  FailedScheduling  6m47s                  default-scheduler  0/3 nodes are available: 1 node(s) were unschedulable, 2 node(s) didn't match pod anti-affinity rules. preemption: 0/3 nodes are available: 1 Preemption is not helpful for scheduling, 2 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  6m45s (x2 over 6m47s)  default-scheduler  0/3 nodes are available: 1 node(s) were unschedulable, 2 node(s) didn't match pod anti-affinity rules. preemption: 0/3 nodes are available: 1 Preemption is not helpful for scheduling, 2 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  6m46s (x2 over 6m48s)  default-scheduler  0/3 nodes are available: 1 node(s) were unschedulable, 2 node(s) didn't match pod anti-affinity rules. preemption: 0/3 nodes are available: 1 Preemption is not helpful for scheduling, 2 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  6m46s (x2 over 6m48s)  default-scheduler  0/3 nodes are available: 1 node(s) were unschedulable, 2 node(s) didn't match pod anti-affinity rules. preemption: 0/3 nodes are available: 1 Preemption is not helpful for scheduling, 2 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  4m22s (x3 over 5m7s)   default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  4m22s (x3 over 4m42s)  default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.

                                                
                                                
-- /stdout --
helpers_test.go:286: <<< TestMultiControlPlane/serial/RestartCluster FAILED: end of post-mortem logs <<<
helpers_test.go:287: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/RestartCluster (377.78s)

                                                
                                    
TestMultiControlPlane/serial/AddSecondaryNode (79.31s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/AddSecondaryNode
ha_test.go:605: (dbg) Run:  out/minikube-darwin-amd64 node add -p ha-949000 --control-plane -v=7 --alsologtostderr
E0831 15:49:15.432487    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:49:15.783308    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
ha_test.go:605: (dbg) Done: out/minikube-darwin-amd64 node add -p ha-949000 --control-plane -v=7 --alsologtostderr: (1m14.518998814s)
ha_test.go:611: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:611: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 2 (443.443919ms)

                                                
                                                
-- stdout --
	ha-949000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-949000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	
	ha-949000-m05
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:50:28.428507    4239 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:50:28.429334    4239 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:50:28.429342    4239 out.go:358] Setting ErrFile to fd 2...
	I0831 15:50:28.429349    4239 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:50:28.429753    4239 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:50:28.430116    4239 out.go:352] Setting JSON to false
	I0831 15:50:28.430142    4239 mustload.go:65] Loading cluster: ha-949000
	I0831 15:50:28.430179    4239 notify.go:220] Checking for updates...
	I0831 15:50:28.430491    4239 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:50:28.430510    4239 status.go:255] checking status of ha-949000 ...
	I0831 15:50:28.430865    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.430910    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.439784    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52210
	I0831 15:50:28.440102    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.440497    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.440507    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.440715    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.440834    4239 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:50:28.440916    4239 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:50:28.440988    4239 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 4017
	I0831 15:50:28.442044    4239 status.go:330] ha-949000 host status = "Running" (err=<nil>)
	I0831 15:50:28.442068    4239 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:50:28.442318    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.442363    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.450862    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52212
	I0831 15:50:28.451194    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.451517    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.451530    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.451758    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.451859    4239 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:50:28.451946    4239 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:50:28.452198    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.452222    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.460744    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52214
	I0831 15:50:28.461082    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.461435    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.461456    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.461665    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.461766    4239 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:50:28.461923    4239 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:50:28.461943    4239 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:50:28.462034    4239 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:50:28.462122    4239 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:50:28.462205    4239 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:50:28.462285    4239 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:50:28.494759    4239 ssh_runner.go:195] Run: systemctl --version
	I0831 15:50:28.499326    4239 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:50:28.511789    4239 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:50:28.511812    4239 api_server.go:166] Checking apiserver status ...
	I0831 15:50:28.511857    4239 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:50:28.525191    4239 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2367/cgroup
	W0831 15:50:28.533600    4239 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2367/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:50:28.533651    4239 ssh_runner.go:195] Run: ls
	I0831 15:50:28.537151    4239 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:50:28.541740    4239 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:50:28.541755    4239 status.go:422] ha-949000 apiserver status = Running (err=<nil>)
	I0831 15:50:28.541769    4239 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:50:28.541781    4239 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:50:28.542071    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.542094    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.551049    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52218
	I0831 15:50:28.551410    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.551730    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.551740    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.551956    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.552070    4239 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:50:28.552159    4239 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:50:28.552251    4239 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 4035
	I0831 15:50:28.553228    4239 status.go:330] ha-949000-m02 host status = "Running" (err=<nil>)
	I0831 15:50:28.553237    4239 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:50:28.553490    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.553519    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.562447    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52220
	I0831 15:50:28.562785    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.563113    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.563130    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.563350    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.563461    4239 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:50:28.563572    4239 host.go:66] Checking if "ha-949000-m02" exists ...
	I0831 15:50:28.563846    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.563877    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.572952    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52222
	I0831 15:50:28.573308    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.573653    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.573664    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.573881    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.573990    4239 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:50:28.574134    4239 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:50:28.574145    4239 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:50:28.574216    4239 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:50:28.574308    4239 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:50:28.574373    4239 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:50:28.574447    4239 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:50:28.603275    4239 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:50:28.616204    4239 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:50:28.616219    4239 api_server.go:166] Checking apiserver status ...
	I0831 15:50:28.616259    4239 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:50:28.628039    4239 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2034/cgroup
	W0831 15:50:28.635260    4239 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2034/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:50:28.635314    4239 ssh_runner.go:195] Run: ls
	I0831 15:50:28.638608    4239 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:50:28.641697    4239 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:50:28.641709    4239 status.go:422] ha-949000-m02 apiserver status = Running (err=<nil>)
	I0831 15:50:28.641722    4239 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:50:28.641733    4239 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:50:28.642008    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.642034    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.650707    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52226
	I0831 15:50:28.651060    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.651415    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.651439    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.651642    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.651741    4239 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:50:28.651821    4239 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:50:28.651910    4239 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 4071
	I0831 15:50:28.652897    4239 status.go:330] ha-949000-m04 host status = "Running" (err=<nil>)
	I0831 15:50:28.652908    4239 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:50:28.653163    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.653193    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.661917    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52228
	I0831 15:50:28.662267    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.662615    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.662623    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.662844    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.662959    4239 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:50:28.663044    4239 host.go:66] Checking if "ha-949000-m04" exists ...
	I0831 15:50:28.663336    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.663361    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.672765    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52230
	I0831 15:50:28.673115    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.673478    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.673497    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.673703    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.673808    4239 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:50:28.673943    4239 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:50:28.673962    4239 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:50:28.674032    4239 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:50:28.674131    4239 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:50:28.674211    4239 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:50:28.674293    4239 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:50:28.704670    4239 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:50:28.715817    4239 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:50:28.715842    4239 status.go:255] checking status of ha-949000-m05 ...
	I0831 15:50:28.716144    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.716167    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.724880    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52233
	I0831 15:50:28.725221    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.725574    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.725590    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.725802    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.725938    4239 main.go:141] libmachine: (ha-949000-m05) Calling .GetState
	I0831 15:50:28.726029    4239 main.go:141] libmachine: (ha-949000-m05) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:50:28.726114    4239 main.go:141] libmachine: (ha-949000-m05) DBG | hyperkit pid from json: 4210
	I0831 15:50:28.727121    4239 status.go:330] ha-949000-m05 host status = "Running" (err=<nil>)
	I0831 15:50:28.727130    4239 host.go:66] Checking if "ha-949000-m05" exists ...
	I0831 15:50:28.727376    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.727406    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.735920    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52235
	I0831 15:50:28.736254    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.736588    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.736608    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.736798    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.736902    4239 main.go:141] libmachine: (ha-949000-m05) Calling .GetIP
	I0831 15:50:28.736984    4239 host.go:66] Checking if "ha-949000-m05" exists ...
	I0831 15:50:28.737248    4239 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:50:28.737268    4239 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:50:28.745833    4239 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52237
	I0831 15:50:28.746200    4239 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:50:28.746559    4239 main.go:141] libmachine: Using API Version  1
	I0831 15:50:28.746578    4239 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:50:28.746782    4239 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:50:28.746889    4239 main.go:141] libmachine: (ha-949000-m05) Calling .DriverName
	I0831 15:50:28.747021    4239 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:50:28.747034    4239 main.go:141] libmachine: (ha-949000-m05) Calling .GetSSHHostname
	I0831 15:50:28.747113    4239 main.go:141] libmachine: (ha-949000-m05) Calling .GetSSHPort
	I0831 15:50:28.747195    4239 main.go:141] libmachine: (ha-949000-m05) Calling .GetSSHKeyPath
	I0831 15:50:28.747267    4239 main.go:141] libmachine: (ha-949000-m05) Calling .GetSSHUsername
	I0831 15:50:28.747345    4239 sshutil.go:53] new ssh client: &{IP:192.169.0.9 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m05/id_rsa Username:docker}
	I0831 15:50:28.780432    4239 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:50:28.791030    4239 kubeconfig.go:125] found "ha-949000" server: "https://192.169.0.254:8443"
	I0831 15:50:28.791045    4239 api_server.go:166] Checking apiserver status ...
	I0831 15:50:28.791081    4239 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:50:28.801999    4239 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1777/cgroup
	W0831 15:50:28.809365    4239 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1777/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:50:28.809419    4239 ssh_runner.go:195] Run: ls
	I0831 15:50:28.812640    4239 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0831 15:50:28.815806    4239 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0831 15:50:28.815817    4239 status.go:422] ha-949000-m05 apiserver status = Running (err=<nil>)
	I0831 15:50:28.815825    4239 status.go:257] ha-949000-m05 status: &{Name:ha-949000-m05 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:613: failed to run minikube status. args "out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr" : exit status 2
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-949000 -n ha-949000
helpers_test.go:245: <<< TestMultiControlPlane/serial/AddSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:246: ======>  post-mortem[TestMultiControlPlane/serial/AddSecondaryNode]: minikube logs <======
helpers_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 logs -n 25
helpers_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 logs -n 25: (3.584729102s)
helpers_test.go:253: TestMultiControlPlane/serial/AddSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- get pods -o          | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-5kkbw -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-6r9s5 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-949000 -- exec                 | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT | 31 Aug 24 15:32 PDT |
	|         | busybox-7dff88458-vjf9x -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-949000 -v=7                | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:32 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-949000 node stop m02 -v=7         | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:33 PDT | 31 Aug 24 15:33 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-949000 node start m02 -v=7        | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:34 PDT | 31 Aug 24 15:34 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-949000 -v=7               | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:35 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-949000 -v=7                    | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:35 PDT | 31 Aug 24 15:36 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-949000 --wait=true -v=7        | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:36 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-949000                    | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT |                     |
	| node    | ha-949000 node delete m03 -v=7       | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT | 31 Aug 24 15:42 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | ha-949000 stop -v=7                  | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT | 31 Aug 24 15:42 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-949000 --wait=true             | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:42 PDT |                     |
	|         | -v=7 --alsologtostderr               |           |         |         |                     |                     |
	|         | --driver=hyperkit                    |           |         |         |                     |                     |
	| node    | add -p ha-949000                     | ha-949000 | jenkins | v1.33.1 | 31 Aug 24 15:49 PDT | 31 Aug 24 15:50 PDT |
	|         | --control-plane -v=7                 |           |         |         |                     |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:42:55
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:42:55.897896    4003 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:42:55.898177    4003 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:42:55.898183    4003 out.go:358] Setting ErrFile to fd 2...
	I0831 15:42:55.898187    4003 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:42:55.898378    4003 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:42:55.899837    4003 out.go:352] Setting JSON to false
	I0831 15:42:55.921901    4003 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2546,"bootTime":1725141629,"procs":434,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:42:55.922001    4003 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:42:55.944577    4003 out.go:177] * [ha-949000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:42:55.987096    4003 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:42:55.987175    4003 notify.go:220] Checking for updates...
	I0831 15:42:56.029932    4003 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:42:56.050856    4003 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:42:56.072033    4003 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:42:56.093103    4003 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:42:56.114053    4003 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:42:56.135758    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:42:56.136428    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.136520    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:56.146197    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52047
	I0831 15:42:56.146589    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:56.146991    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:42:56.147003    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:56.147207    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:56.147336    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.147526    4003 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:42:56.147753    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.147780    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:56.156287    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52049
	I0831 15:42:56.156619    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:56.156971    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:42:56.156990    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:56.157191    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:56.157316    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.186031    4003 out.go:177] * Using the hyperkit driver based on existing profile
	I0831 15:42:56.227918    4003 start.go:297] selected driver: hyperkit
	I0831 15:42:56.227945    4003 start.go:901] validating driver "hyperkit" against &{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false
ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirro
r: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:42:56.228199    4003 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:42:56.228401    4003 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:42:56.228599    4003 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:42:56.238336    4003 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:42:56.242056    4003 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.242078    4003 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:42:56.244705    4003 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:42:56.244747    4003 cni.go:84] Creating CNI manager for ""
	I0831 15:42:56.244753    4003 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0831 15:42:56.244827    4003 start.go:340] cluster config:
	{Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.16
9.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false
kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: S
ocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:42:56.244921    4003 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:42:56.286816    4003 out.go:177] * Starting "ha-949000" primary control-plane node in "ha-949000" cluster
	I0831 15:42:56.307847    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:42:56.307937    4003 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:42:56.307963    4003 cache.go:56] Caching tarball of preloaded images
	I0831 15:42:56.308209    4003 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:42:56.308229    4003 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:42:56.308418    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:42:56.309323    4003 start.go:360] acquireMachinesLock for ha-949000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:42:56.309437    4003 start.go:364] duration metric: took 90.572µs to acquireMachinesLock for "ha-949000"
	I0831 15:42:56.309468    4003 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:42:56.309488    4003 fix.go:54] fixHost starting: 
	I0831 15:42:56.309922    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:56.309949    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:56.318888    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52051
	I0831 15:42:56.319241    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:56.319612    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:42:56.319626    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:56.319866    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:56.320016    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.320133    4003 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:42:56.320226    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:56.320300    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:42:56.321264    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 3756 missing from process table
	I0831 15:42:56.321288    4003 fix.go:112] recreateIfNeeded on ha-949000: state=Stopped err=<nil>
	I0831 15:42:56.321305    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	W0831 15:42:56.321391    4003 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:42:56.363717    4003 out.go:177] * Restarting existing hyperkit VM for "ha-949000" ...
	I0831 15:42:56.384899    4003 main.go:141] libmachine: (ha-949000) Calling .Start
	I0831 15:42:56.385294    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:56.385370    4003 main.go:141] libmachine: (ha-949000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid
	I0831 15:42:56.387089    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 3756 missing from process table
	I0831 15:42:56.387099    4003 main.go:141] libmachine: (ha-949000) DBG | pid 3756 is in state "Stopped"
	I0831 15:42:56.387115    4003 main.go:141] libmachine: (ha-949000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid...
	I0831 15:42:56.387550    4003 main.go:141] libmachine: (ha-949000) DBG | Using UUID 98cab9ba-901d-49d1-9e6c-321a4533d56e
	I0831 15:42:56.496381    4003 main.go:141] libmachine: (ha-949000) DBG | Generated MAC ce:8:77:f7:42:5e
	I0831 15:42:56.496404    4003 main.go:141] libmachine: (ha-949000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:42:56.496533    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003834d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:42:56.496559    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"98cab9ba-901d-49d1-9e6c-321a4533d56e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003834d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:42:56.496621    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "98cab9ba-901d-49d1-9e6c-321a4533d56e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:42:56.496665    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 98cab9ba-901d-49d1-9e6c-321a4533d56e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/ha-949000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:42:56.496684    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:42:56.498385    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 DEBUG: hyperkit: Pid is 4017
	I0831 15:42:56.498816    4003 main.go:141] libmachine: (ha-949000) DBG | Attempt 0
	I0831 15:42:56.498834    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:56.498897    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 4017
	I0831 15:42:56.500466    4003 main.go:141] libmachine: (ha-949000) DBG | Searching for ce:8:77:f7:42:5e in /var/db/dhcpd_leases ...
	I0831 15:42:56.500539    4003 main.go:141] libmachine: (ha-949000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:42:56.500570    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39c5e}
	I0831 15:42:56.500583    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 15:42:56.500598    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:42:56.500613    4003 main.go:141] libmachine: (ha-949000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4ec63}
	I0831 15:42:56.500643    4003 main.go:141] libmachine: (ha-949000) DBG | Found match: ce:8:77:f7:42:5e
	I0831 15:42:56.500654    4003 main.go:141] libmachine: (ha-949000) DBG | IP: 192.169.0.5
	I0831 15:42:56.500687    4003 main.go:141] libmachine: (ha-949000) Calling .GetConfigRaw
	I0831 15:42:56.501361    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:42:56.501546    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:42:56.501931    4003 machine.go:93] provisionDockerMachine start ...
	I0831 15:42:56.501942    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:42:56.502103    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:42:56.502225    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:42:56.502347    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:42:56.502457    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:42:56.502550    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:42:56.502680    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:42:56.502894    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:42:56.502905    4003 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:42:56.506309    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:42:56.558516    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:42:56.559184    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:42:56.559207    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:42:56.559278    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:42:56.559308    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:42:56.940245    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:42:56.940260    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:56 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:42:57.055064    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:42:57.055080    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:42:57.055092    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:42:57.055101    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:42:57.056061    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:42:57.056073    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:42:57 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:43:02.655390    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0831 15:43:02.655429    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0831 15:43:02.655438    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0831 15:43:02.679403    4003 main.go:141] libmachine: (ha-949000) DBG | 2024/08/31 15:43:02 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0831 15:43:07.568442    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:43:07.568456    4003 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:43:07.568651    4003 buildroot.go:166] provisioning hostname "ha-949000"
	I0831 15:43:07.568662    4003 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:43:07.568760    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.568847    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:07.568962    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.569093    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.569187    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:07.569365    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:07.569534    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:07.569549    4003 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000 && echo "ha-949000" | sudo tee /etc/hostname
	I0831 15:43:07.639291    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000
	
	I0831 15:43:07.639309    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.639436    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:07.639557    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.639638    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.639737    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:07.639874    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:07.640074    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:07.640086    4003 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:43:07.704134    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:43:07.704155    4003 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:43:07.704172    4003 buildroot.go:174] setting up certificates
	I0831 15:43:07.704178    4003 provision.go:84] configureAuth start
	I0831 15:43:07.704186    4003 main.go:141] libmachine: (ha-949000) Calling .GetMachineName
	I0831 15:43:07.704317    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:43:07.704420    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.704522    4003 provision.go:143] copyHostCerts
	I0831 15:43:07.704550    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:07.704624    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:43:07.704632    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:07.704768    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:43:07.704971    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:07.705012    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:43:07.705016    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:07.705108    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:43:07.705254    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:07.705294    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:43:07.705299    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:07.705382    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:43:07.705569    4003 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000 san=[127.0.0.1 192.169.0.5 ha-949000 localhost minikube]
	I0831 15:43:07.906186    4003 provision.go:177] copyRemoteCerts
	I0831 15:43:07.906273    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:43:07.906312    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:07.906550    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:07.906738    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:07.906937    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:07.907046    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:07.944033    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:43:07.944107    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:43:07.963419    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:43:07.963482    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0831 15:43:07.982821    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:43:07.982884    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 15:43:08.001703    4003 provision.go:87] duration metric: took 297.505228ms to configureAuth
	I0831 15:43:08.001714    4003 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:43:08.001892    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:08.001909    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:08.002046    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:08.002137    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:08.002225    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.002306    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.002382    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:08.002501    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:08.002650    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:08.002659    4003 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:43:08.059324    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:43:08.059336    4003 buildroot.go:70] root file system type: tmpfs
	I0831 15:43:08.059403    4003 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:43:08.059416    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:08.059551    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:08.059659    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.059758    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.059843    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:08.059967    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:08.060104    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:08.060148    4003 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:43:08.127622    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:43:08.127643    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:08.127795    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:08.127885    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.127986    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:08.128093    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:08.128219    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:08.128373    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:08.128385    4003 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:43:09.818482    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:43:09.818495    4003 machine.go:96] duration metric: took 13.316412951s to provisionDockerMachine
	I0831 15:43:09.818507    4003 start.go:293] postStartSetup for "ha-949000" (driver="hyperkit")
	I0831 15:43:09.818514    4003 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:43:09.818524    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:09.818708    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:43:09.818733    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:09.818845    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:09.818952    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.819031    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:09.819124    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:09.856201    4003 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:43:09.861552    4003 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:43:09.861568    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:43:09.861690    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:43:09.861873    4003 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:43:09.861880    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:43:09.862086    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:43:09.873444    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:09.903949    4003 start.go:296] duration metric: took 85.422286ms for postStartSetup
	I0831 15:43:09.903973    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:09.904145    4003 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:43:09.904158    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:09.904244    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:09.904332    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.904406    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:09.904491    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:09.939732    4003 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:43:09.939783    4003 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:43:09.973207    4003 fix.go:56] duration metric: took 13.663579156s for fixHost
	I0831 15:43:09.973228    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:09.973356    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:09.973449    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.973546    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:09.973619    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:09.973749    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:09.973922    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0831 15:43:09.973930    4003 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:43:10.027714    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725144190.095289778
	
	I0831 15:43:10.027726    4003 fix.go:216] guest clock: 1725144190.095289778
	I0831 15:43:10.027732    4003 fix.go:229] Guest: 2024-08-31 15:43:10.095289778 -0700 PDT Remote: 2024-08-31 15:43:09.973219 -0700 PDT m=+14.110517944 (delta=122.070778ms)
	I0831 15:43:10.027767    4003 fix.go:200] guest clock delta is within tolerance: 122.070778ms
	I0831 15:43:10.027774    4003 start.go:83] releasing machines lock for "ha-949000", held for 13.718178323s
	I0831 15:43:10.027798    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.027932    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:43:10.028026    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.028324    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.028419    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:10.028500    4003 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:43:10.028533    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:10.028579    4003 ssh_runner.go:195] Run: cat /version.json
	I0831 15:43:10.028591    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:10.028629    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:10.028705    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:10.028719    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:10.028882    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:10.028892    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:10.028975    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:10.028990    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:10.029049    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:10.106178    4003 ssh_runner.go:195] Run: systemctl --version
	I0831 15:43:10.111111    4003 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0831 15:43:10.115308    4003 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:43:10.115344    4003 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:43:10.127805    4003 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:43:10.127827    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:10.127920    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:10.145626    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:43:10.154624    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:43:10.163250    4003 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:43:10.163290    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:43:10.172090    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:10.180802    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:43:10.189726    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:10.198477    4003 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:43:10.207531    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:43:10.216228    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:43:10.224957    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:43:10.233724    4003 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:43:10.241776    4003 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:43:10.249895    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:10.347162    4003 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:43:10.365744    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:10.365818    4003 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:43:10.378577    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:10.391840    4003 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:43:10.407333    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:10.418578    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:10.428427    4003 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:43:10.447942    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:10.460400    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:10.475459    4003 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:43:10.478281    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:43:10.485396    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:43:10.498761    4003 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:43:10.593460    4003 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:43:10.696411    4003 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:43:10.696483    4003 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:43:10.710317    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:10.803031    4003 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:43:13.157366    4003 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.354290436s)
	I0831 15:43:13.157446    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:43:13.167970    4003 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 15:43:13.180929    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:13.191096    4003 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:43:13.293424    4003 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:43:13.392743    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:13.483508    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:43:13.497374    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:13.508419    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:13.608347    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:43:13.667376    4003 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:43:13.667470    4003 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:43:13.671956    4003 start.go:563] Will wait 60s for crictl version
	I0831 15:43:13.672004    4003 ssh_runner.go:195] Run: which crictl
	I0831 15:43:13.675617    4003 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:43:13.702050    4003 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:43:13.702122    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:13.720302    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:13.762901    4003 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:43:13.762952    4003 main.go:141] libmachine: (ha-949000) Calling .GetIP
	I0831 15:43:13.763326    4003 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:43:13.768068    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:13.778798    4003 kubeadm.go:883] updating cluster {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 15:43:13.778877    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:43:13.778928    4003 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:43:13.792562    4003 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:43:13.792576    4003 docker.go:615] Images already preloaded, skipping extraction
	I0831 15:43:13.792671    4003 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 15:43:13.806816    4003 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 15:43:13.806831    4003 cache_images.go:84] Images are preloaded, skipping loading
	I0831 15:43:13.806842    4003 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.31.0 docker true true} ...
	I0831 15:43:13.806921    4003 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:43:13.806997    4003 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 15:43:13.845829    4003 cni.go:84] Creating CNI manager for ""
	I0831 15:43:13.845843    4003 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0831 15:43:13.845854    4003 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 15:43:13.845869    4003 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-949000 NodeName:ha-949000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 15:43:13.845940    4003 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-949000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 15:43:13.845960    4003 kube-vip.go:115] generating kube-vip config ...
	I0831 15:43:13.846014    4003 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:43:13.859390    4003 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:43:13.859457    4003 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:43:13.859510    4003 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:43:13.867760    4003 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:43:13.867806    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0831 15:43:13.876386    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0831 15:43:13.889628    4003 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:43:13.903120    4003 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0831 15:43:13.916765    4003 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:43:13.930236    4003 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:43:13.933264    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:13.943217    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:14.038507    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:43:14.052829    4003 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.5
	I0831 15:43:14.052841    4003 certs.go:194] generating shared ca certs ...
	I0831 15:43:14.052850    4003 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.053024    4003 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:43:14.053101    4003 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:43:14.053114    4003 certs.go:256] generating profile certs ...
	I0831 15:43:14.053197    4003 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:43:14.053222    4003 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0
	I0831 15:43:14.053237    4003 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0831 15:43:14.128581    4003 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0 ...
	I0831 15:43:14.128599    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0: {Name:mk00e438b52db2444ba8ce93d114dacf50fb7384 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.129258    4003 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0 ...
	I0831 15:43:14.129272    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0: {Name:mkd10daf9fa17e10453b3bbf65f5132bb9bcd577 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.129503    4003 certs.go:381] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt.43b6ffe0 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt
	I0831 15:43:14.129738    4003 certs.go:385] copying /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.43b6ffe0 -> /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key
	I0831 15:43:14.129977    4003 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:43:14.129987    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:43:14.130020    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:43:14.130040    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:43:14.130058    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:43:14.130075    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:43:14.130093    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:43:14.130110    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:43:14.130128    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:43:14.130233    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:43:14.130284    4003 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:43:14.130292    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:43:14.130322    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:43:14.130355    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:43:14.130384    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:43:14.130447    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:14.130483    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.130504    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.130522    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.131005    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:43:14.153234    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:43:14.186923    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:43:14.229049    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:43:14.284589    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0831 15:43:14.334141    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:43:14.385269    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:43:14.429545    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:43:14.461048    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:43:14.494719    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:43:14.523624    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:43:14.557563    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 15:43:14.571298    4003 ssh_runner.go:195] Run: openssl version
	I0831 15:43:14.575654    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:43:14.584028    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.587453    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.587495    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:43:14.591803    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:43:14.600098    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:43:14.608239    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.611660    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.611694    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:43:14.615930    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:43:14.624111    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:43:14.632509    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.636012    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.636052    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:14.640278    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:43:14.648758    4003 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:43:14.652057    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:43:14.656418    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:43:14.660743    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:43:14.665063    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:43:14.669321    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:43:14.673568    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 15:43:14.677784    4003 kubeadm.go:392] StartCluster: {Name:ha-949000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:43:14.677912    4003 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 15:43:14.690883    4003 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 15:43:14.698384    4003 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0831 15:43:14.698396    4003 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0831 15:43:14.698441    4003 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0831 15:43:14.706022    4003 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:43:14.706313    4003 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-949000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:14.706401    4003 kubeconfig.go:62] /Users/jenkins/minikube-integration/18943-957/kubeconfig needs updating (will repair): [kubeconfig missing "ha-949000" cluster setting kubeconfig missing "ha-949000" context setting]
	I0831 15:43:14.706628    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.707280    4003 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:14.707484    4003 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, Use
rAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52edc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 15:43:14.707808    4003 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 15:43:14.707985    4003 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0831 15:43:14.715222    4003 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0831 15:43:14.715234    4003 kubeadm.go:597] duration metric: took 16.834195ms to restartPrimaryControlPlane
	I0831 15:43:14.715240    4003 kubeadm.go:394] duration metric: took 37.459181ms to StartCluster
	I0831 15:43:14.715249    4003 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.715327    4003 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:14.715694    4003 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:14.715917    4003 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:43:14.715930    4003 start.go:241] waiting for startup goroutines ...
	I0831 15:43:14.715938    4003 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 15:43:14.716058    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:14.761177    4003 out.go:177] * Enabled addons: 
	I0831 15:43:14.783218    4003 addons.go:510] duration metric: took 67.285233ms for enable addons: enabled=[]
	I0831 15:43:14.783269    4003 start.go:246] waiting for cluster config update ...
	I0831 15:43:14.783281    4003 start.go:255] writing updated cluster config ...
	I0831 15:43:14.806130    4003 out.go:201] 
	I0831 15:43:14.827581    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:14.827719    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:43:14.850202    4003 out.go:177] * Starting "ha-949000-m02" control-plane node in "ha-949000" cluster
	I0831 15:43:14.892085    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:43:14.892153    4003 cache.go:56] Caching tarball of preloaded images
	I0831 15:43:14.892329    4003 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:43:14.892347    4003 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:43:14.892479    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:43:14.893510    4003 start.go:360] acquireMachinesLock for ha-949000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:43:14.893615    4003 start.go:364] duration metric: took 79.031µs to acquireMachinesLock for "ha-949000-m02"
	I0831 15:43:14.893640    4003 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:43:14.893648    4003 fix.go:54] fixHost starting: m02
	I0831 15:43:14.894056    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:43:14.894083    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:43:14.903465    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52073
	I0831 15:43:14.903886    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:43:14.904288    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:43:14.904300    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:43:14.904593    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:43:14.904763    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:14.904931    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:43:14.905038    4003 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:14.905115    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3763
	I0831 15:43:14.906087    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3763 missing from process table
	I0831 15:43:14.906133    4003 fix.go:112] recreateIfNeeded on ha-949000-m02: state=Stopped err=<nil>
	I0831 15:43:14.906161    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	W0831 15:43:14.906324    4003 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:43:14.949174    4003 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m02" ...
	I0831 15:43:14.970157    4003 main.go:141] libmachine: (ha-949000-m02) Calling .Start
	I0831 15:43:14.970435    4003 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:14.970489    4003 main.go:141] libmachine: (ha-949000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid
	I0831 15:43:14.972233    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3763 missing from process table
	I0831 15:43:14.972246    4003 main.go:141] libmachine: (ha-949000-m02) DBG | pid 3763 is in state "Stopped"
	I0831 15:43:14.972295    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid...
	I0831 15:43:14.972683    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Using UUID 23e5d675-5201-4f3d-86b7-b25c818528d1
	I0831 15:43:14.998998    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Generated MAC 92:7:3c:3f:ee:b7
	I0831 15:43:14.999027    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:43:14.999117    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bea80)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:43:14.999143    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"23e5d675-5201-4f3d-86b7-b25c818528d1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bea80)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:43:14.999177    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "23e5d675-5201-4f3d-86b7-b25c818528d1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-94
9000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:43:14.999231    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 23e5d675-5201-4f3d-86b7-b25c818528d1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/ha-949000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 co
nsole=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:43:14.999254    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:14 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:43:15.000658    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 DEBUG: hyperkit: Pid is 4035
	I0831 15:43:15.001119    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Attempt 0
	I0831 15:43:15.001129    4003 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:15.001211    4003 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 4035
	I0831 15:43:15.003022    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Searching for 92:7:3c:3f:ee:b7 in /var/db/dhcpd_leases ...
	I0831 15:43:15.003110    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:43:15.003135    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 15:43:15.003157    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39c5e}
	I0831 15:43:15.003193    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:fa:59:9e:3b:35:6d ID:1,fa:59:9e:3b:35:6d Lease:0x66d39c56}
	I0831 15:43:15.003213    4003 main.go:141] libmachine: (ha-949000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ec75}
	I0831 15:43:15.003221    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetConfigRaw
	I0831 15:43:15.003228    4003 main.go:141] libmachine: (ha-949000-m02) DBG | Found match: 92:7:3c:3f:ee:b7
	I0831 15:43:15.003263    4003 main.go:141] libmachine: (ha-949000-m02) DBG | IP: 192.169.0.6
	I0831 15:43:15.003898    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:15.004131    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:43:15.004587    4003 machine.go:93] provisionDockerMachine start ...
	I0831 15:43:15.004598    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:15.004713    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:15.004819    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:15.004915    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:15.005012    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:15.005089    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:15.005222    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:15.005366    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:15.005373    4003 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:43:15.008900    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:43:15.017748    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:43:15.018656    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:43:15.018679    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:43:15.018711    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:43:15.018731    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:43:15.399794    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:43:15.399810    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:43:15.514263    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:43:15.514282    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:43:15.514290    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:43:15.514296    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:43:15.515095    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:43:15.515105    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:15 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:43:21.084857    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:43:21.085024    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:43:21.085033    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:43:21.108855    4003 main.go:141] libmachine: (ha-949000-m02) DBG | 2024/08/31 15:43:21 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:43:50.068778    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:43:50.068792    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:43:50.068914    4003 buildroot.go:166] provisioning hostname "ha-949000-m02"
	I0831 15:43:50.068926    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:43:50.069013    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.069099    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.069176    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.069263    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.069336    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.069474    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.069630    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.069640    4003 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m02 && echo "ha-949000-m02" | sudo tee /etc/hostname
	I0831 15:43:50.130987    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m02
	
	I0831 15:43:50.131001    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.131142    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.131239    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.131330    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.131429    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.131565    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.131704    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.131716    4003 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:43:50.189171    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:43:50.189186    4003 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:43:50.189202    4003 buildroot.go:174] setting up certificates
	I0831 15:43:50.189208    4003 provision.go:84] configureAuth start
	I0831 15:43:50.189215    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetMachineName
	I0831 15:43:50.189354    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:50.189440    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.189529    4003 provision.go:143] copyHostCerts
	I0831 15:43:50.189563    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:50.189610    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:43:50.189616    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:43:50.189739    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:43:50.189940    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:50.189969    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:43:50.189973    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:43:50.190084    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:43:50.190251    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:50.190286    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:43:50.190291    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:43:50.190364    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:43:50.190554    4003 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m02 san=[127.0.0.1 192.169.0.6 ha-949000-m02 localhost minikube]
	I0831 15:43:50.447994    4003 provision.go:177] copyRemoteCerts
	I0831 15:43:50.448048    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:43:50.448062    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.448197    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.448289    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.448376    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.448469    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:50.481386    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:43:50.481457    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:43:50.500479    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:43:50.500539    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:43:50.519580    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:43:50.519638    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:43:50.538582    4003 provision.go:87] duration metric: took 349.361412ms to configureAuth
	I0831 15:43:50.538594    4003 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:43:50.538767    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:50.538781    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:50.538915    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.539010    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.539090    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.539170    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.539253    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.539350    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.539469    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.539477    4003 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:43:50.589461    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:43:50.589472    4003 buildroot.go:70] root file system type: tmpfs
	I0831 15:43:50.589565    4003 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:43:50.589575    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.589709    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.589808    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.589904    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.589997    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.590114    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.590247    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.590295    4003 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:43:50.650656    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:43:50.650675    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:50.650817    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:50.650904    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.650975    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:50.651066    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:50.651189    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:50.651328    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:50.651340    4003 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:43:52.319769    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:43:52.319783    4003 machine.go:96] duration metric: took 37.314787706s to provisionDockerMachine
	I0831 15:43:52.319791    4003 start.go:293] postStartSetup for "ha-949000-m02" (driver="hyperkit")
	I0831 15:43:52.319799    4003 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:43:52.319809    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.319999    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:43:52.320012    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.320113    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.320206    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.320293    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.320379    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:52.352031    4003 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:43:52.355233    4003 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:43:52.355244    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:43:52.355330    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:43:52.355466    4003 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:43:52.355473    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:43:52.355627    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:43:52.362886    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:52.382898    4003 start.go:296] duration metric: took 63.098255ms for postStartSetup
	I0831 15:43:52.382918    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.383098    4003 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:43:52.383110    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.383181    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.383271    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.383354    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.383436    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:52.415810    4003 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:43:52.415864    4003 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:43:52.449230    4003 fix.go:56] duration metric: took 37.555176154s for fixHost
	I0831 15:43:52.449254    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.449385    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.449479    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.449570    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.449656    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.449784    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:43:52.449933    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0831 15:43:52.449941    4003 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:43:52.500604    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725144232.566642995
	
	I0831 15:43:52.500618    4003 fix.go:216] guest clock: 1725144232.566642995
	I0831 15:43:52.500629    4003 fix.go:229] Guest: 2024-08-31 15:43:52.566642995 -0700 PDT Remote: 2024-08-31 15:43:52.449243 -0700 PDT m=+56.586086649 (delta=117.399995ms)
	I0831 15:43:52.500641    4003 fix.go:200] guest clock delta is within tolerance: 117.399995ms
	I0831 15:43:52.500644    4003 start.go:83] releasing machines lock for "ha-949000-m02", held for 37.60661602s
	I0831 15:43:52.500661    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.500790    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:52.524083    4003 out.go:177] * Found network options:
	I0831 15:43:52.545377    4003 out.go:177]   - NO_PROXY=192.169.0.5
	W0831 15:43:52.567312    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:43:52.567341    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.567964    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.568161    4003 main.go:141] libmachine: (ha-949000-m02) Calling .DriverName
	I0831 15:43:52.568240    4003 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:43:52.568275    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	W0831 15:43:52.568384    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:43:52.568419    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.568477    4003 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:43:52.568494    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHHostname
	I0831 15:43:52.568580    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.568637    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHPort
	I0831 15:43:52.568715    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.568763    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHKeyPath
	I0831 15:43:52.568895    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetSSHUsername
	I0831 15:43:52.568930    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	I0831 15:43:52.569064    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m02/id_rsa Username:docker}
	W0831 15:43:52.598238    4003 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:43:52.598301    4003 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:43:52.641479    4003 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:43:52.641502    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:52.641620    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:52.657762    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:43:52.666682    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:43:52.675584    4003 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:43:52.675632    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:43:52.684590    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:52.693450    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:43:52.702203    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:43:52.711110    4003 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:43:52.720178    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:43:52.729030    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:43:52.738456    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:43:52.748149    4003 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:43:52.756790    4003 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:43:52.765391    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:52.862859    4003 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:43:52.883299    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:43:52.883366    4003 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:43:52.900841    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:52.911689    4003 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:43:52.925373    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:43:52.936790    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:52.947768    4003 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:43:52.969807    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:43:52.980241    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:43:52.995125    4003 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:43:52.998026    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:43:53.005290    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:43:53.018832    4003 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:43:53.124064    4003 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:43:53.226798    4003 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:43:53.226820    4003 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:43:53.241337    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:53.342509    4003 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:43:55.695532    4003 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.352978813s)
	I0831 15:43:55.695593    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:43:55.706164    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:55.716443    4003 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:43:55.813069    4003 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:43:55.914225    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:56.017829    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:43:56.031977    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:43:56.043082    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:56.147482    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:43:56.211631    4003 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:43:56.211708    4003 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:43:56.216202    4003 start.go:563] Will wait 60s for crictl version
	I0831 15:43:56.216251    4003 ssh_runner.go:195] Run: which crictl
	I0831 15:43:56.223176    4003 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:43:56.247497    4003 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:43:56.247568    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:56.264978    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:43:56.322638    4003 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:43:56.344590    4003 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:43:56.365748    4003 main.go:141] libmachine: (ha-949000-m02) Calling .GetIP
	I0831 15:43:56.366152    4003 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:43:56.370681    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:56.380351    4003 mustload.go:65] Loading cluster: ha-949000
	I0831 15:43:56.380517    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:56.380743    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:43:56.380758    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:43:56.389551    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52095
	I0831 15:43:56.390006    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:43:56.390330    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:43:56.390341    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:43:56.390567    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:43:56.390683    4003 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:43:56.390760    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:43:56.390827    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 4017
	I0831 15:43:56.391784    4003 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:43:56.392030    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:43:56.392047    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:43:56.400646    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52097
	I0831 15:43:56.401071    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:43:56.401432    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:43:56.401449    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:43:56.401654    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:43:56.401763    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:43:56.401863    4003 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.6
	I0831 15:43:56.401868    4003 certs.go:194] generating shared ca certs ...
	I0831 15:43:56.401876    4003 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:43:56.402014    4003 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:43:56.402069    4003 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:43:56.402077    4003 certs.go:256] generating profile certs ...
	I0831 15:43:56.402165    4003 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key
	I0831 15:43:56.402256    4003 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key.2cd83952
	I0831 15:43:56.402304    4003 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key
	I0831 15:43:56.402311    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:43:56.402331    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:43:56.402351    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:43:56.402368    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:43:56.402387    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 15:43:56.402405    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 15:43:56.402427    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 15:43:56.402445    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 15:43:56.402522    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:43:56.402560    4003 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:43:56.402572    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:43:56.402605    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:43:56.402639    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:43:56.402671    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:43:56.402737    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:43:56.402769    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.402811    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.402831    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.402857    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHHostname
	I0831 15:43:56.402948    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHPort
	I0831 15:43:56.403031    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHKeyPath
	I0831 15:43:56.403124    4003 main.go:141] libmachine: (ha-949000) Calling .GetSSHUsername
	I0831 15:43:56.403213    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000/id_rsa Username:docker}
	I0831 15:43:56.428694    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0831 15:43:56.431875    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0831 15:43:56.440490    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0831 15:43:56.443670    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0831 15:43:56.452165    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0831 15:43:56.455207    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0831 15:43:56.463624    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0831 15:43:56.466671    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1679 bytes)
	I0831 15:43:56.475535    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0831 15:43:56.478615    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0831 15:43:56.487110    4003 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0831 15:43:56.490238    4003 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0831 15:43:56.498895    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:43:56.519238    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:43:56.539011    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:43:56.558598    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:43:56.578234    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0831 15:43:56.597888    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0831 15:43:56.617519    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 15:43:56.637284    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 15:43:56.657084    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:43:56.676448    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:43:56.696310    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:43:56.715741    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0831 15:43:56.729513    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0831 15:43:56.743001    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0831 15:43:56.756453    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1679 bytes)
	I0831 15:43:56.770115    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0831 15:43:56.784073    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0831 15:43:56.797658    4003 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0831 15:43:56.810908    4003 ssh_runner.go:195] Run: openssl version
	I0831 15:43:56.815001    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:43:56.823241    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.826641    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.826682    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:43:56.830949    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:43:56.839331    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:43:56.847777    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.851154    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.851190    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:43:56.855448    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:43:56.863829    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:43:56.872178    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.875731    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.875765    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:43:56.879995    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:43:56.888471    4003 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:43:56.892039    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 15:43:56.896510    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 15:43:56.900794    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 15:43:56.904975    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 15:43:56.909175    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 15:43:56.913367    4003 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 15:43:56.917519    4003 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.31.0 docker true true} ...
	I0831 15:43:56.917575    4003 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:43:56.917596    4003 kube-vip.go:115] generating kube-vip config ...
	I0831 15:43:56.917626    4003 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0831 15:43:56.929983    4003 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0831 15:43:56.930030    4003 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0831 15:43:56.930087    4003 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:43:56.938650    4003 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:43:56.938693    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0831 15:43:56.948188    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:43:56.962082    4003 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:43:56.975374    4003 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0831 15:43:56.989089    4003 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:43:56.991924    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:43:57.001250    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:57.094190    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:43:57.108747    4003 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 15:43:57.108933    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:43:57.130218    4003 out.go:177] * Verifying Kubernetes components...
	I0831 15:43:57.171663    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:43:57.293447    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:43:57.304999    4003 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:43:57.305203    4003 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52edc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:43:57.305240    4003 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:43:57.305411    4003 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:43:57.305492    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:43:57.305497    4003 round_trippers.go:469] Request Headers:
	I0831 15:43:57.305505    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:43:57.305514    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.959710    4003 round_trippers.go:574] Response Status: 200 OK in 8654 milliseconds
	I0831 15:44:05.960438    4003 node_ready.go:49] node "ha-949000-m02" has status "Ready":"True"
	I0831 15:44:05.960449    4003 node_ready.go:38] duration metric: took 8.6549293s for node "ha-949000-m02" to be "Ready" ...
	I0831 15:44:05.960456    4003 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:44:05.960491    4003 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 15:44:05.960499    4003 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 15:44:05.960533    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:05.960537    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.960545    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.960552    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.970871    4003 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0831 15:44:05.978345    4003 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:05.978408    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-kjszm
	I0831 15:44:05.978422    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.978429    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.978433    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.985369    4003 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:44:05.985815    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:05.985824    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.985830    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.985833    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:05.991184    4003 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:44:05.991513    4003 pod_ready.go:93] pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:05.991523    4003 pod_ready.go:82] duration metric: took 13.160164ms for pod "coredns-6f6b679f8f-kjszm" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:05.991530    4003 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:05.991572    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-snq8s
	I0831 15:44:05.991577    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:05.991582    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:05.991587    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.000332    4003 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0831 15:44:06.000855    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.000863    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.000872    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.000878    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.013265    4003 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0831 15:44:06.013530    4003 pod_ready.go:93] pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.013539    4003 pod_ready.go:82] duration metric: took 22.004461ms for pod "coredns-6f6b679f8f-snq8s" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.013546    4003 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.013590    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000
	I0831 15:44:06.013595    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.013601    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.013603    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.020268    4003 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0831 15:44:06.020643    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.020651    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.020657    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.020661    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.027711    4003 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0831 15:44:06.028254    4003 pod_ready.go:93] pod "etcd-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.028264    4003 pod_ready.go:82] duration metric: took 14.711969ms for pod "etcd-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.028272    4003 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.028311    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m02
	I0831 15:44:06.028316    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.028322    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.028326    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.039178    4003 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0831 15:44:06.039603    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:06.039612    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.039618    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.039621    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.041381    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:06.041651    4003 pod_ready.go:93] pod "etcd-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.041661    4003 pod_ready.go:82] duration metric: took 13.384756ms for pod "etcd-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.041667    4003 pod_ready.go:79] waiting up to 6m0s for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.041704    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-949000-m03
	I0831 15:44:06.041709    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.041715    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.041718    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.043280    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:06.161143    4003 request.go:632] Waited for 117.478694ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:06.161211    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:06.161216    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.161222    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.161225    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.165879    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:44:06.166023    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "etcd-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:06.166034    4003 pod_ready.go:82] duration metric: took 124.360492ms for pod "etcd-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:06.166042    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "etcd-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:06.166052    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.361793    4003 request.go:632] Waited for 195.664438ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:44:06.361828    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000
	I0831 15:44:06.361833    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.361839    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.361847    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.363761    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:06.561193    4003 request.go:632] Waited for 196.830957ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.561252    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:06.561266    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.561279    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.561292    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.564567    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:06.565116    4003 pod_ready.go:93] pod "kube-apiserver-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.565128    4003 pod_ready.go:82] duration metric: took 399.063144ms for pod "kube-apiserver-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.565137    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.761258    4003 request.go:632] Waited for 195.975667ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:44:06.761325    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m02
	I0831 15:44:06.761334    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.761351    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.761363    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.764874    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:06.960633    4003 request.go:632] Waited for 195.219559ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:06.960666    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:06.960695    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:06.960702    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:06.960706    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:06.966407    4003 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:44:06.966698    4003 pod_ready.go:93] pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:06.966707    4003 pod_ready.go:82] duration metric: took 401.560896ms for pod "kube-apiserver-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:06.966714    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.161478    4003 request.go:632] Waited for 194.704872ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:44:07.161625    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-949000-m03
	I0831 15:44:07.161636    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.161647    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.161656    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.165538    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:07.361967    4003 request.go:632] Waited for 195.95763ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:07.362001    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:07.362006    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.362012    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.362016    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.363942    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:44:07.364015    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:07.364027    4003 pod_ready.go:82] duration metric: took 397.303245ms for pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:07.364034    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-apiserver-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:07.364047    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.561375    4003 request.go:632] Waited for 197.282382ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:44:07.561418    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000
	I0831 15:44:07.561424    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.561430    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.561434    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.563374    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:07.761585    4003 request.go:632] Waited for 197.505917ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:07.761680    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:07.761692    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.761703    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.761710    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.765076    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:07.765411    4003 pod_ready.go:93] pod "kube-controller-manager-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:07.765423    4003 pod_ready.go:82] duration metric: took 401.363562ms for pod "kube-controller-manager-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.765432    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:07.961150    4003 request.go:632] Waited for 195.676394ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:44:07.961210    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m02
	I0831 15:44:07.961216    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:07.961223    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:07.961232    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:07.963936    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.160774    4003 request.go:632] Waited for 196.46087ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.160847    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.160855    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.160863    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.160885    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.163147    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.163737    4003 pod_ready.go:93] pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:08.163748    4003 pod_ready.go:82] duration metric: took 398.305248ms for pod "kube-controller-manager-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:08.163756    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:08.360946    4003 request.go:632] Waited for 197.148459ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:44:08.361013    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-949000-m03
	I0831 15:44:08.361018    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.361025    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.361030    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.363306    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.561349    4003 request.go:632] Waited for 197.594231ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:08.561505    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:08.561518    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.561529    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.561536    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.564572    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:44:08.564661    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:08.564674    4003 pod_ready.go:82] duration metric: took 400.906717ms for pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:08.564683    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-controller-manager-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:08.564694    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:08.760636    4003 request.go:632] Waited for 195.893531ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:08.760715    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:08.760720    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.760726    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.760729    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.763646    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:08.961865    4003 request.go:632] Waited for 197.701917ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.961922    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:08.961933    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:08.961945    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:08.961952    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:08.964688    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:09.160991    4003 request.go:632] Waited for 95.682906ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:09.161056    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:09.161066    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.161078    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.161088    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.164212    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:09.360988    4003 request.go:632] Waited for 196.217621ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.361022    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.361027    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.361055    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.361059    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.363713    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:09.564888    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:09.564900    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.564907    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.564913    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.568623    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:09.760895    4003 request.go:632] Waited for 191.666981ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.760944    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:09.760952    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:09.760958    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:09.760962    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:09.763257    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:10.065958    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:10.065982    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.065993    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.065998    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.069180    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:10.162666    4003 request.go:632] Waited for 93.035977ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:10.162750    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:10.162767    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.162780    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.162786    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.165653    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:10.565356    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:10.565380    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.565391    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.565397    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.568883    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:10.569642    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:10.569650    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:10.569655    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:10.569658    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:10.571069    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:10.571366    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:11.066968    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:11.066994    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.067006    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.067011    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.070763    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:11.071322    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:11.071330    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.071335    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.071339    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.072824    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:11.565282    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:11.565303    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.565314    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.565320    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.568672    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:11.569364    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:11.569371    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:11.569378    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:11.569381    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:11.571110    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:12.065991    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:12.066013    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.066025    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.066038    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.070105    4003 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:44:12.070531    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:12.070540    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.070548    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.070553    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.072400    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:12.566716    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:12.566745    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.566756    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.566762    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.570548    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:12.570980    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:12.570991    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:12.571000    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:12.571005    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:12.573075    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:12.573392    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:13.065503    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:13.065529    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.065540    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.065545    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.069028    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:13.069606    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:13.069616    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.069624    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.069628    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.071291    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:13.566706    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:13.566719    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.566724    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.566727    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.568695    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:13.569316    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:13.569324    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:13.569330    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:13.569340    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:13.570910    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:14.066070    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:14.066097    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.066110    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.066122    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.069846    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:14.070280    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:14.070288    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.070294    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.070298    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.071983    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:14.565072    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:14.565092    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.565103    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.565121    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.568991    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:14.569470    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:14.569478    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:14.569486    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:14.569489    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:14.571194    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:15.065570    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:15.065590    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.065602    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.065608    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.069259    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:15.069742    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:15.069750    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.069756    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.069761    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.071256    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:15.071608    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:15.565664    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:15.565722    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.565736    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.565743    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.568446    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:15.568953    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:15.568960    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:15.568966    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:15.568969    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:15.570393    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:16.066647    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:16.066673    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.066683    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.066689    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.069968    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:16.070667    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:16.070678    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.070686    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.070700    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.072421    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:16.565080    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:16.565093    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.565100    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.565105    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.567016    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:16.567805    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:16.567814    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:16.567819    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:16.567829    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:16.569508    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:17.065211    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:17.065233    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.065243    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.065249    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.068848    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:17.069431    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:17.069442    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.069451    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.069454    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.071237    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:17.565694    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:17.565715    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.565726    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.565732    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.569041    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:17.569625    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:17.569632    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:17.569638    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:17.569648    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:17.571537    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:17.572005    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:18.065338    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:18.065353    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.065361    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.065365    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.067574    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:18.067956    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:18.067963    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.067969    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.067973    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.069437    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:18.565941    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:18.565963    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.565974    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.565984    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.569115    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:18.569832    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:18.569842    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:18.569850    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:18.569854    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:18.571574    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:19.065517    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:19.065533    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.065540    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.065545    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.068125    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:19.068655    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:19.068662    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.068667    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.068672    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.070197    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:19.566293    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:19.566372    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.566385    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.566395    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.569750    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:19.570211    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:19.570219    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:19.570224    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:19.570229    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:19.571922    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:19.572415    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:20.065051    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:20.065066    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.065073    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.065078    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.070133    4003 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 15:44:20.070557    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:20.070565    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.070570    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.070573    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.072277    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:20.566009    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:20.566031    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.566042    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.566051    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.570001    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:20.570447    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:20.570453    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:20.570458    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:20.570460    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:20.572199    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:21.065187    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:21.065210    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.065222    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.065227    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.067898    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:21.068345    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:21.068353    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.068358    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.068362    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.069742    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:21.565705    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:21.565724    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.565735    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.565741    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.568938    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:21.569630    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:21.569641    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:21.569647    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:21.569655    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:21.571194    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:22.065027    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:22.065051    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.065062    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.065100    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.069375    4003 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:44:22.069729    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:22.069737    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.069743    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.069747    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.072208    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:22.072476    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:22.565894    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:22.565917    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.565928    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.565937    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.569490    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:22.569886    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:22.569893    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:22.569899    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:22.569903    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:22.571462    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:23.066179    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:23.066201    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.066212    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.066219    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.070218    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:23.070845    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:23.070855    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.070862    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.070867    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.072481    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:23.565085    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:23.565099    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.565105    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.565109    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.567899    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:23.568397    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:23.568405    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:23.568411    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:23.568431    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:23.571121    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:24.065227    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:24.065249    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.065261    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.065270    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.068196    4003 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 15:44:24.068774    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:24.068782    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.068787    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.068791    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.070317    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:24.565319    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:24.565332    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.565337    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.565340    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.567104    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:24.567586    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:24.567594    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:24.567600    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:24.567603    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:24.569279    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:24.569664    4003 pod_ready.go:103] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"False"
	I0831 15:44:25.066218    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-4r2bt
	I0831 15:44:25.066244    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.066255    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.066260    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.069969    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.070824    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:25.070848    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.070854    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.070863    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.072406    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.072709    4003 pod_ready.go:93] pod "kube-proxy-4r2bt" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.072718    4003 pod_ready.go:82] duration metric: took 16.507839534s for pod "kube-proxy-4r2bt" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.072727    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.072755    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-d45q5
	I0831 15:44:25.072760    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.072765    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.072769    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.074170    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.074584    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:25.074591    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.074596    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.074599    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.076066    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:44:25.076162    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-proxy-d45q5" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.076170    4003 pod_ready.go:82] duration metric: took 3.437579ms for pod "kube-proxy-d45q5" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:25.076175    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-proxy-d45q5" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.076179    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.076206    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-q7ndn
	I0831 15:44:25.076211    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.076216    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.076219    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.077746    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.078120    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:25.078127    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.078133    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.078136    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.079498    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.079894    4003 pod_ready.go:93] pod "kube-proxy-q7ndn" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.079903    4003 pod_ready.go:82] duration metric: took 3.717598ms for pod "kube-proxy-q7ndn" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.079909    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.079936    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000
	I0831 15:44:25.079941    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.079946    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.079951    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.081600    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.081932    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000
	I0831 15:44:25.081940    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.081946    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.081949    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.083262    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.083552    4003 pod_ready.go:93] pod "kube-scheduler-ha-949000" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.083561    4003 pod_ready.go:82] duration metric: took 3.647661ms for pod "kube-scheduler-ha-949000" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.083567    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.083594    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m02
	I0831 15:44:25.083603    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.083609    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.083614    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.085111    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.085438    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m02
	I0831 15:44:25.085446    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.085452    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.085455    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.087068    4003 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 15:44:25.087348    4003 pod_ready.go:93] pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace has status "Ready":"True"
	I0831 15:44:25.087357    4003 pod_ready.go:82] duration metric: took 3.784951ms for pod "kube-scheduler-ha-949000-m02" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.087363    4003 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	I0831 15:44:25.267294    4003 request.go:632] Waited for 179.857802ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:44:25.267367    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-949000-m03
	I0831 15:44:25.267377    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.267389    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.267395    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.271053    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.466554    4003 request.go:632] Waited for 195.015611ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:25.466691    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m03
	I0831 15:44:25.466701    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.466712    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.466721    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.470050    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:44:25.470127    4003 pod_ready.go:98] node "ha-949000-m03" hosting pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.470148    4003 pod_ready.go:82] duration metric: took 382.775358ms for pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace to be "Ready" ...
	E0831 15:44:25.470158    4003 pod_ready.go:67] WaitExtra: waitPodCondition: node "ha-949000-m03" hosting pod "kube-scheduler-ha-949000-m03" in "kube-system" namespace is currently not "Ready" (skipping!): error getting node "ha-949000-m03": nodes "ha-949000-m03" not found
	I0831 15:44:25.470165    4003 pod_ready.go:39] duration metric: took 19.509491295s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 15:44:25.470190    4003 api_server.go:52] waiting for apiserver process to appear ...
	I0831 15:44:25.470257    4003 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:44:25.483780    4003 api_server.go:72] duration metric: took 28.374703678s to wait for apiserver process to appear ...
	I0831 15:44:25.483792    4003 api_server.go:88] waiting for apiserver healthz status ...
	I0831 15:44:25.483807    4003 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0831 15:44:25.486833    4003 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0831 15:44:25.486870    4003 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0831 15:44:25.486875    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.486882    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.486887    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.487354    4003 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 15:44:25.487409    4003 api_server.go:141] control plane version: v1.31.0
	I0831 15:44:25.487417    4003 api_server.go:131] duration metric: took 3.620759ms to wait for apiserver health ...
	I0831 15:44:25.487424    4003 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 15:44:25.666509    4003 request.go:632] Waited for 179.03877ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:25.666550    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:25.666557    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.666565    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.666601    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.670513    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.675202    4003 system_pods.go:59] 24 kube-system pods found
	I0831 15:44:25.675220    4003 system_pods.go:61] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:44:25.675225    4003 system_pods.go:61] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:44:25.675229    4003 system_pods.go:61] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:44:25.675232    4003 system_pods.go:61] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:44:25.675236    4003 system_pods.go:61] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:44:25.675238    4003 system_pods.go:61] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:44:25.675241    4003 system_pods.go:61] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:44:25.675244    4003 system_pods.go:61] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:44:25.675247    4003 system_pods.go:61] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:44:25.675249    4003 system_pods.go:61] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:44:25.675252    4003 system_pods.go:61] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:44:25.675255    4003 system_pods.go:61] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:44:25.675258    4003 system_pods.go:61] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:44:25.675261    4003 system_pods.go:61] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:44:25.675263    4003 system_pods.go:61] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:44:25.675266    4003 system_pods.go:61] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:44:25.675268    4003 system_pods.go:61] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:44:25.675271    4003 system_pods.go:61] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:44:25.675274    4003 system_pods.go:61] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:44:25.675280    4003 system_pods.go:61] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:44:25.675283    4003 system_pods.go:61] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:44:25.675286    4003 system_pods.go:61] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:44:25.675288    4003 system_pods.go:61] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:44:25.675292    4003 system_pods.go:61] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:44:25.675296    4003 system_pods.go:74] duration metric: took 187.866388ms to wait for pod list to return data ...
	I0831 15:44:25.675301    4003 default_sa.go:34] waiting for default service account to be created ...
	I0831 15:44:25.866631    4003 request.go:632] Waited for 191.264353ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:44:25.866761    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0831 15:44:25.866771    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:25.866783    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:25.866789    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:25.870307    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:25.870649    4003 default_sa.go:45] found service account: "default"
	I0831 15:44:25.870663    4003 default_sa.go:55] duration metric: took 195.354455ms for default service account to be created ...
	I0831 15:44:25.870670    4003 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 15:44:26.067233    4003 request.go:632] Waited for 196.47603ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:26.067280    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0831 15:44:26.067290    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:26.067301    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:26.067307    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:26.072107    4003 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 15:44:26.077415    4003 system_pods.go:86] 24 kube-system pods found
	I0831 15:44:26.077426    4003 system_pods.go:89] "coredns-6f6b679f8f-kjszm" [8d58b21f-98f4-48f6-a2fa-60b880e045df] Running
	I0831 15:44:26.077431    4003 system_pods.go:89] "coredns-6f6b679f8f-snq8s" [7df21163-affb-4e72-812c-a662e9b8d69b] Running
	I0831 15:44:26.077435    4003 system_pods.go:89] "etcd-ha-949000" [11dd683e-70ae-4025-8b1b-bc7f24a8dd9f] Running
	I0831 15:44:26.077439    4003 system_pods.go:89] "etcd-ha-949000-m02" [072c3f73-c6a7-42cf-a2db-c7322d666afb] Running
	I0831 15:44:26.077442    4003 system_pods.go:89] "etcd-ha-949000-m03" [00f31422-15f3-46aa-8805-651d2e0defb9] Running
	I0831 15:44:26.077446    4003 system_pods.go:89] "kindnet-9j85v" [af2dac08-1f4f-49ed-999e-b4d10ff22c2c] Running
	I0831 15:44:26.077448    4003 system_pods.go:89] "kindnet-brtj6" [7c27f09c-99ee-438b-9c03-07ad8986c32b] Running
	I0831 15:44:26.077451    4003 system_pods.go:89] "kindnet-jzj42" [1f3f503b-44ec-4332-84cb-ddba5f4bfb13] Running
	I0831 15:44:26.077454    4003 system_pods.go:89] "kube-apiserver-ha-949000" [6c30e803-6443-4d66-9210-fd065ba8fd4f] Running
	I0831 15:44:26.077459    4003 system_pods.go:89] "kube-apiserver-ha-949000-m02" [602fdc7d-d3b4-4937-9eb7-62a6a58b3d17] Running
	I0831 15:44:26.077462    4003 system_pods.go:89] "kube-apiserver-ha-949000-m03" [a922a4b2-8cc9-4c31-b00b-c9923a51472e] Running
	I0831 15:44:26.077467    4003 system_pods.go:89] "kube-controller-manager-ha-949000" [96efb4c9-4a9d-402b-8524-73f86b775d6e] Running
	I0831 15:44:26.077470    4003 system_pods.go:89] "kube-controller-manager-ha-949000-m02" [08d3fdc3-40a6-4666-bd1b-798afb26eecb] Running
	I0831 15:44:26.077473    4003 system_pods.go:89] "kube-controller-manager-ha-949000-m03" [2d4c4c7f-b540-4f83-9d8a-48d031e14873] Running
	I0831 15:44:26.077477    4003 system_pods.go:89] "kube-proxy-4r2bt" [84ea931a-0c2c-43a7-bf18-3aa5062cdc8e] Running
	I0831 15:44:26.077479    4003 system_pods.go:89] "kube-proxy-d45q5" [9d7251d8-af8a-4a2e-b3c9-a16cd981fcf2] Running
	I0831 15:44:26.077482    4003 system_pods.go:89] "kube-proxy-q7ndn" [9caa8816-ece3-4a7e-b4e1-64ae0769d450] Running
	I0831 15:44:26.077485    4003 system_pods.go:89] "kube-scheduler-ha-949000" [db20baa3-3ae4-4318-bb87-e97fb80c1074] Running
	I0831 15:44:26.077488    4003 system_pods.go:89] "kube-scheduler-ha-949000-m02" [2dc28f40-c8f7-4de2-b25f-939a94b80cca] Running
	I0831 15:44:26.077491    4003 system_pods.go:89] "kube-scheduler-ha-949000-m03" [2c394308-3e00-482a-85c3-ced3e86e0d52] Running
	I0831 15:44:26.077494    4003 system_pods.go:89] "kube-vip-ha-949000" [98967a2c-6641-4193-b7ce-c0fbdee58344] Running
	I0831 15:44:26.077497    4003 system_pods.go:89] "kube-vip-ha-949000-m02" [2af174e1-a5f0-49c8-aadd-13d8c1b4068f] Running
	I0831 15:44:26.077499    4003 system_pods.go:89] "kube-vip-ha-949000-m03" [a30f45e2-f2ac-4a28-a3af-5c0189352f9f] Running
	I0831 15:44:26.077502    4003 system_pods.go:89] "storage-provisioner" [03bcdd23-f7f2-45a9-ab95-91918e094226] Running
	I0831 15:44:26.077506    4003 system_pods.go:126] duration metric: took 206.829ms to wait for k8s-apps to be running ...
	I0831 15:44:26.077512    4003 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 15:44:26.077564    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:44:26.088970    4003 system_svc.go:56] duration metric: took 11.450852ms WaitForService to wait for kubelet
	I0831 15:44:26.088985    4003 kubeadm.go:582] duration metric: took 28.979903586s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 15:44:26.088998    4003 node_conditions.go:102] verifying NodePressure condition ...
	I0831 15:44:26.266791    4003 request.go:632] Waited for 177.710266ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0831 15:44:26.266867    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0831 15:44:26.266875    4003 round_trippers.go:469] Request Headers:
	I0831 15:44:26.266886    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:44:26.266896    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:44:26.270407    4003 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 15:44:26.271146    4003 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:44:26.271167    4003 node_conditions.go:123] node cpu capacity is 2
	I0831 15:44:26.271180    4003 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 15:44:26.271188    4003 node_conditions.go:123] node cpu capacity is 2
	I0831 15:44:26.271193    4003 node_conditions.go:105] duration metric: took 182.189243ms to run NodePressure ...
	I0831 15:44:26.271203    4003 start.go:241] waiting for startup goroutines ...
	I0831 15:44:26.271229    4003 start.go:255] writing updated cluster config ...
	I0831 15:44:26.293325    4003 out.go:201] 
	I0831 15:44:26.315324    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:44:26.315453    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:44:26.337988    4003 out.go:177] * Starting "ha-949000-m04" worker node in "ha-949000" cluster
	I0831 15:44:26.380685    4003 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:44:26.380719    4003 cache.go:56] Caching tarball of preloaded images
	I0831 15:44:26.380921    4003 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 15:44:26.380941    4003 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 15:44:26.381080    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:44:26.382207    4003 start.go:360] acquireMachinesLock for ha-949000-m04: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 15:44:26.382290    4003 start.go:364] duration metric: took 66.399µs to acquireMachinesLock for "ha-949000-m04"
	I0831 15:44:26.382307    4003 start.go:96] Skipping create...Using existing machine configuration
	I0831 15:44:26.382314    4003 fix.go:54] fixHost starting: m04
	I0831 15:44:26.382612    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:44:26.382638    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:44:26.391652    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52102
	I0831 15:44:26.391986    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:44:26.392342    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:44:26.392365    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:44:26.392613    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:44:26.392733    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:44:26.392824    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:44:26.392912    4003 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:44:26.392996    4003 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3806
	I0831 15:44:26.393933    4003 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid 3806 missing from process table
	I0831 15:44:26.393956    4003 fix.go:112] recreateIfNeeded on ha-949000-m04: state=Stopped err=<nil>
	I0831 15:44:26.393965    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	W0831 15:44:26.394099    4003 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 15:44:26.414853    4003 out.go:177] * Restarting existing hyperkit VM for "ha-949000-m04" ...
	I0831 15:44:26.456728    4003 main.go:141] libmachine: (ha-949000-m04) Calling .Start
	I0831 15:44:26.457073    4003 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:44:26.457142    4003 main.go:141] libmachine: (ha-949000-m04) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid
	I0831 15:44:26.457233    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Using UUID 5ee34770-2239-4427-9789-bd204fe095a6
	I0831 15:44:26.482643    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Generated MAC 8a:3c:61:5f:c5:84
	I0831 15:44:26.482668    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000
	I0831 15:44:26.482825    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001201e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:44:26.482873    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5ee34770-2239-4427-9789-bd204fe095a6", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001201e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 15:44:26.482921    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "5ee34770-2239-4427-9789-bd204fe095a6", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"}
	I0831 15:44:26.482962    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 5ee34770-2239-4427-9789-bd204fe095a6 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/ha-949000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-949000"
	I0831 15:44:26.482975    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 15:44:26.484373    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 DEBUG: hyperkit: Pid is 4071
	I0831 15:44:26.484859    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Attempt 0
	I0831 15:44:26.484876    4003 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:44:26.484959    4003 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 4071
	I0831 15:44:26.487135    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Searching for 8a:3c:61:5f:c5:84 in /var/db/dhcpd_leases ...
	I0831 15:44:26.487196    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0831 15:44:26.487221    4003 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:92:7:3c:3f:ee:b7 ID:1,92:7:3c:3f:ee:b7 Lease:0x66d4ee0c}
	I0831 15:44:26.487236    4003 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:ce:8:77:f7:42:5e ID:1,ce:8:77:f7:42:5e Lease:0x66d4edf9}
	I0831 15:44:26.487249    4003 main.go:141] libmachine: (ha-949000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:8a:3c:61:5f:c5:84 ID:1,8a:3c:61:5f:c5:84 Lease:0x66d39c5e}
	I0831 15:44:26.487264    4003 main.go:141] libmachine: (ha-949000-m04) DBG | Found match: 8a:3c:61:5f:c5:84
	I0831 15:44:26.487276    4003 main.go:141] libmachine: (ha-949000-m04) DBG | IP: 192.169.0.8
	I0831 15:44:26.487302    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetConfigRaw
	I0831 15:44:26.488058    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:44:26.488267    4003 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/config.json ...
	I0831 15:44:26.488733    4003 machine.go:93] provisionDockerMachine start ...
	I0831 15:44:26.488743    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:44:26.488866    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:44:26.488967    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:44:26.489052    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:44:26.489152    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:44:26.489235    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:44:26.489342    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:44:26.489512    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:44:26.489524    4003 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 15:44:26.492093    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 15:44:26.500227    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 15:44:26.501190    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:44:26.501211    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:44:26.501222    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:44:26.501234    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:44:26.887163    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 15:44:26.887179    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:26 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 15:44:27.001897    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 15:44:27.001917    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 15:44:27.001935    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 15:44:27.001949    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 15:44:27.002783    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 15:44:27.002794    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:27 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 15:44:32.603005    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 15:44:32.603055    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 15:44:32.603066    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 15:44:32.626242    4003 main.go:141] libmachine: (ha-949000-m04) DBG | 2024/08/31 15:44:32 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 15:45:01.551772    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 15:45:01.551791    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:45:01.551924    4003 buildroot.go:166] provisioning hostname "ha-949000-m04"
	I0831 15:45:01.551935    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:45:01.552030    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.552119    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.552201    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.552291    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.552372    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.552497    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.552634    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.552642    4003 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-949000-m04 && echo "ha-949000-m04" | sudo tee /etc/hostname
	I0831 15:45:01.616885    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-949000-m04
	
	I0831 15:45:01.616906    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.617041    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.617145    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.617232    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.617317    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.617452    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.617606    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.617618    4003 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-949000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-949000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-949000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 15:45:01.675471    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 15:45:01.675486    4003 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 15:45:01.675499    4003 buildroot.go:174] setting up certificates
	I0831 15:45:01.675505    4003 provision.go:84] configureAuth start
	I0831 15:45:01.675512    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetMachineName
	I0831 15:45:01.675643    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:45:01.675763    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.675858    4003 provision.go:143] copyHostCerts
	I0831 15:45:01.675886    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:45:01.675959    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 15:45:01.675965    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 15:45:01.676118    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 15:45:01.676365    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:45:01.676407    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 15:45:01.676412    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 15:45:01.676500    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 15:45:01.676663    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:45:01.676709    4003 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 15:45:01.676714    4003 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 15:45:01.676793    4003 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 15:45:01.676940    4003 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.ha-949000-m04 san=[127.0.0.1 192.169.0.8 ha-949000-m04 localhost minikube]
	I0831 15:45:01.762314    4003 provision.go:177] copyRemoteCerts
	I0831 15:45:01.762367    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 15:45:01.762382    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.762557    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.762656    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.762756    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.762844    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:01.796205    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 15:45:01.796279    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 15:45:01.815211    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 15:45:01.815279    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0831 15:45:01.834188    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 15:45:01.834257    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 15:45:01.853640    4003 provision.go:87] duration metric: took 178.124085ms to configureAuth
	I0831 15:45:01.853653    4003 buildroot.go:189] setting minikube options for container-runtime
	I0831 15:45:01.853819    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:45:01.853832    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:01.853954    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.854036    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.854122    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.854210    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.854294    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.854407    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.854531    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.854538    4003 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 15:45:01.906456    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 15:45:01.906469    4003 buildroot.go:70] root file system type: tmpfs
	I0831 15:45:01.906548    4003 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 15:45:01.906561    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.906689    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.906786    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.906885    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.906960    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.907078    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.907226    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.907270    4003 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 15:45:01.970284    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 15:45:01.970303    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:01.970453    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:01.970548    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.970632    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:01.970725    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:01.970876    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:01.971019    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:01.971040    4003 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 15:45:03.516394    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 15:45:03.516410    4003 machine.go:96] duration metric: took 37.027272003s to provisionDockerMachine
	I0831 15:45:03.516419    4003 start.go:293] postStartSetup for "ha-949000-m04" (driver="hyperkit")
	I0831 15:45:03.516426    4003 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 15:45:03.516446    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.516635    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 15:45:03.516649    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.516745    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.516831    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.516911    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.517003    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:03.549510    4003 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 15:45:03.552575    4003 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 15:45:03.552586    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 15:45:03.552685    4003 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 15:45:03.552861    4003 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 15:45:03.552868    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 15:45:03.553075    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 15:45:03.560251    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:45:03.579932    4003 start.go:296] duration metric: took 63.505056ms for postStartSetup
	I0831 15:45:03.579953    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.580123    4003 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0831 15:45:03.580137    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.580227    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.580304    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.580383    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.580463    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:03.613355    4003 machine.go:197] restoring vm config from /var/lib/minikube/backup: [etc]
	I0831 15:45:03.613415    4003 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0831 15:45:03.667593    4003 fix.go:56] duration metric: took 37.284874453s for fixHost
	I0831 15:45:03.667632    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.667887    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.668092    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.668253    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.668442    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.668679    4003 main.go:141] libmachine: Using SSH client type: native
	I0831 15:45:03.668942    4003 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3c31ea0] 0x3c34c00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0831 15:45:03.668957    4003 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 15:45:03.721925    4003 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725144303.791568584
	
	I0831 15:45:03.721940    4003 fix.go:216] guest clock: 1725144303.791568584
	I0831 15:45:03.721945    4003 fix.go:229] Guest: 2024-08-31 15:45:03.791568584 -0700 PDT Remote: 2024-08-31 15:45:03.667616 -0700 PDT m=+127.803695939 (delta=123.952584ms)
	I0831 15:45:03.721980    4003 fix.go:200] guest clock delta is within tolerance: 123.952584ms
	I0831 15:45:03.721984    4003 start.go:83] releasing machines lock for "ha-949000-m04", held for 37.339285395s
	I0831 15:45:03.722007    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.722145    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:45:03.745774    4003 out.go:177] * Found network options:
	I0831 15:45:03.767373    4003 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0831 15:45:03.788896    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:45:03.788955    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:45:03.788975    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.789814    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.790060    4003 main.go:141] libmachine: (ha-949000-m04) Calling .DriverName
	I0831 15:45:03.790166    4003 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 15:45:03.790203    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	W0831 15:45:03.790303    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 15:45:03.790355    4003 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 15:45:03.790430    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.790462    4003 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 15:45:03.790479    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHHostname
	I0831 15:45:03.790581    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.790645    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHPort
	I0831 15:45:03.790761    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.790846    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHKeyPath
	I0831 15:45:03.790934    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	I0831 15:45:03.791028    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetSSHUsername
	I0831 15:45:03.791215    4003 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/ha-949000-m04/id_rsa Username:docker}
	W0831 15:45:03.820995    4003 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 15:45:03.821055    4003 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 15:45:03.865115    4003 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 15:45:03.865138    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:45:03.865245    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:45:03.881224    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 15:45:03.890437    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 15:45:03.899610    4003 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 15:45:03.899666    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 15:45:03.908938    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:45:03.918184    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 15:45:03.927312    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 15:45:03.936702    4003 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 15:45:03.946157    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 15:45:03.955222    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 15:45:03.964152    4003 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 15:45:03.973257    4003 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 15:45:03.981558    4003 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 15:45:03.989901    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:04.086014    4003 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 15:45:04.105538    4003 start.go:495] detecting cgroup driver to use...
	I0831 15:45:04.105610    4003 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 15:45:04.121430    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:45:04.134788    4003 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 15:45:04.151049    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 15:45:04.161844    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:45:04.172949    4003 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 15:45:04.191373    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 15:45:04.201771    4003 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 15:45:04.216770    4003 ssh_runner.go:195] Run: which cri-dockerd
	I0831 15:45:04.219760    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 15:45:04.226792    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 15:45:04.240592    4003 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 15:45:04.340799    4003 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 15:45:04.439649    4003 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 15:45:04.439671    4003 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 15:45:04.453918    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:04.542337    4003 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 15:45:06.812888    4003 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.270508765s)
	I0831 15:45:06.812949    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 15:45:06.823181    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:45:06.833531    4003 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 15:45:06.936150    4003 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 15:45:07.044179    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:07.137898    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 15:45:07.152258    4003 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 15:45:07.163263    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:07.258016    4003 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 15:45:07.318759    4003 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 15:45:07.318841    4003 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 15:45:07.323364    4003 start.go:563] Will wait 60s for crictl version
	I0831 15:45:07.323422    4003 ssh_runner.go:195] Run: which crictl
	I0831 15:45:07.326572    4003 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 15:45:07.358444    4003 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 15:45:07.358520    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:45:07.376088    4003 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 15:45:07.414824    4003 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 15:45:07.456544    4003 out.go:177]   - env NO_PROXY=192.169.0.5
	I0831 15:45:07.477408    4003 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0831 15:45:07.498401    4003 main.go:141] libmachine: (ha-949000-m04) Calling .GetIP
	I0831 15:45:07.498760    4003 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 15:45:07.503179    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:45:07.513368    4003 mustload.go:65] Loading cluster: ha-949000
	I0831 15:45:07.513553    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:45:07.513782    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:45:07.513810    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:45:07.522673    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52124
	I0831 15:45:07.523026    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:45:07.523408    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:45:07.523425    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:45:07.523666    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:45:07.523786    4003 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:45:07.523871    4003 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:45:07.523962    4003 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 4017
	I0831 15:45:07.524938    4003 host.go:66] Checking if "ha-949000" exists ...
	I0831 15:45:07.525205    4003 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:45:07.525236    4003 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:45:07.534543    4003 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52126
	I0831 15:45:07.534878    4003 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:45:07.535207    4003 main.go:141] libmachine: Using API Version  1
	I0831 15:45:07.535219    4003 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:45:07.535443    4003 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:45:07.535559    4003 main.go:141] libmachine: (ha-949000) Calling .DriverName
	I0831 15:45:07.535653    4003 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000 for IP: 192.169.0.8
	I0831 15:45:07.535660    4003 certs.go:194] generating shared ca certs ...
	I0831 15:45:07.535672    4003 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 15:45:07.535838    4003 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 15:45:07.535909    4003 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 15:45:07.535919    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 15:45:07.535943    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 15:45:07.535961    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 15:45:07.535978    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 15:45:07.536528    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 15:45:07.536755    4003 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 15:45:07.536797    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 15:45:07.536911    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 15:45:07.536985    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 15:45:07.537034    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 15:45:07.537191    4003 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 15:45:07.537301    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.537538    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.537562    4003 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.537590    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 15:45:07.557183    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 15:45:07.576458    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 15:45:07.595921    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 15:45:07.615402    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 15:45:07.634516    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 15:45:07.653693    4003 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 15:45:07.673154    4003 ssh_runner.go:195] Run: openssl version
	I0831 15:45:07.677604    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 15:45:07.686971    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.690415    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.690457    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 15:45:07.694634    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 15:45:07.703764    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 15:45:07.713184    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.716497    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.716528    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 15:45:07.720770    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 15:45:07.729910    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 15:45:07.739116    4003 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.742456    4003 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.742497    4003 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 15:45:07.746707    4003 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 15:45:07.755769    4003 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 15:45:07.758843    4003 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 15:45:07.758878    4003 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.31.0  false true} ...
	I0831 15:45:07.758938    4003 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-949000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:ha-949000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 15:45:07.758975    4003 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 15:45:07.767346    4003 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 15:45:07.767390    4003 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0831 15:45:07.775359    4003 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0831 15:45:07.788534    4003 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 15:45:07.801886    4003 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0831 15:45:07.804685    4003 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 15:45:07.814373    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:07.913307    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:45:07.928102    4003 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}
	I0831 15:45:07.928288    4003 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:45:07.970019    4003 out.go:177] * Verifying Kubernetes components...
	I0831 15:45:07.990872    4003 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 15:45:08.095722    4003 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 15:45:08.845027    4003 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:45:08.845280    4003 kapi.go:59] client config for ha-949000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/ha-949000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, U
serAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52edc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0831 15:45:08.845323    4003 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0831 15:45:08.845512    4003 node_ready.go:35] waiting up to 6m0s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:45:08.845557    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:08.845562    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:08.845568    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:08.845571    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:08.847724    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:09.347758    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:09.347784    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:09.347795    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:09.347801    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:09.351055    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:09.846963    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:09.846989    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:09.847000    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:09.847007    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:09.850830    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:10.346983    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:10.346994    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:10.347000    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:10.347004    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:10.349173    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:10.845886    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:10.845909    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:10.845920    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:10.845929    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:10.848792    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:10.848857    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:11.347504    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:11.347528    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:11.347539    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:11.347545    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:11.350440    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:11.846697    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:11.846722    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:11.846744    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:11.846747    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:11.848994    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:12.346908    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:12.346932    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:12.346943    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:12.346949    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:12.349967    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:12.846545    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:12.846570    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:12.846582    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:12.846586    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:12.850076    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:12.850171    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:13.345681    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:13.345701    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:13.345708    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:13.345713    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:13.347803    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:13.846672    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:13.846700    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:13.846713    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:13.846719    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:13.850213    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:14.346092    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:14.346108    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:14.346114    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:14.346118    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:14.348283    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:14.846918    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:14.846932    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:14.846938    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:14.846941    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:14.849111    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:15.346636    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:15.346651    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:15.346661    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:15.346691    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:15.348385    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:15.348441    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:15.846720    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:15.846746    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:15.846757    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:15.846800    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:15.850040    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:16.346787    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:16.346802    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:16.346807    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:16.346810    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:16.349402    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:16.846242    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:16.846267    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:16.846279    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:16.846285    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:16.849465    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:17.346142    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:17.346155    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:17.346163    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:17.346166    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:17.350190    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:45:17.350267    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:17.846549    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:17.846563    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:17.846569    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:17.846574    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:17.848738    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:18.346533    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:18.346558    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:18.346628    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:18.346635    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:18.349746    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:18.845790    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:18.845803    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:18.845810    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:18.845813    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:18.852753    4003 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0831 15:45:19.345910    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:19.345921    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:19.345927    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:19.345930    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:19.348239    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:19.846161    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:19.846188    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:19.846205    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:19.846222    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:19.849249    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:19.849335    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:20.347424    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:20.347504    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:20.347518    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:20.347524    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:20.350150    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:20.845819    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:20.845835    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:20.845842    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:20.845845    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:20.848156    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:21.347305    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:21.347322    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:21.347329    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:21.347334    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:21.349936    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:21.846477    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:21.846497    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:21.846509    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:21.846518    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:21.849139    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:22.346802    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:22.346822    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:22.346830    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:22.346842    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:22.348962    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:22.349019    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:22.847375    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:22.847401    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:22.847456    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:22.847466    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:22.850916    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:23.347018    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:23.347030    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:23.347037    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:23.347041    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:23.348873    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:23.846396    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:23.846412    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:23.846418    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:23.846421    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:23.848619    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:24.346563    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:24.346587    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:24.346598    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:24.346605    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:24.349517    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:24.349596    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:24.847762    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:24.847788    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:24.847799    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:24.847807    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:24.850902    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:25.346975    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:25.346987    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:25.346993    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:25.346996    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:25.349147    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:25.846141    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:25.846199    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:25.846211    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:25.846217    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:25.849027    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:26.346014    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:26.346036    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:26.346047    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:26.346053    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:26.349317    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:26.846724    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:26.846739    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:26.846745    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:26.846748    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:26.848768    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:26.848825    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:27.347046    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:27.347061    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:27.347084    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:27.347088    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:27.349358    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:27.847241    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:27.847266    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:27.847278    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:27.847284    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:27.850635    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:28.346098    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:28.346111    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:28.346118    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:28.346120    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:28.348238    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:28.846743    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:28.846769    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:28.846780    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:28.846788    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:28.850051    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:28.850126    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:29.347209    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:29.347223    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:29.347230    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:29.347234    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:29.349262    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:29.847853    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:29.847871    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:29.847899    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:29.847903    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:29.850095    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:30.346592    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:30.346613    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:30.346624    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:30.346630    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:30.349712    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:30.846746    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:30.846772    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:30.846782    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:30.846787    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:30.850071    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:30.850159    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:31.347223    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:31.347268    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:31.347276    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:31.347280    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:31.349187    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:31.846144    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:31.846163    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:31.846180    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:31.846184    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:31.848310    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:32.346217    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:32.346239    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:32.346248    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:32.346254    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:32.348537    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:32.846981    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:32.846996    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:32.847003    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:32.847010    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:32.848991    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:33.346415    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:33.346427    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:33.346433    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:33.346436    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:33.348444    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:33.348503    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:33.845996    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:33.846023    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:33.846066    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:33.846076    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:33.849334    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:34.347376    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:34.347391    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:34.347398    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:34.347401    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:34.349645    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:34.848093    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:34.848113    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:34.848126    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:34.848134    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:34.851450    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:35.346386    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:35.346405    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:35.346416    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:35.346421    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:35.349660    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:35.349728    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:35.846776    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:35.846793    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:35.846799    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:35.846803    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:35.848988    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:36.348020    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:36.348045    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:36.348055    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:36.348061    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:36.351289    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:36.846442    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:36.846466    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:36.846478    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:36.846485    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:36.849727    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:37.346585    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:37.346598    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:37.346604    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:37.346608    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:37.348823    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:37.846395    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:37.846414    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:37.846425    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:37.846431    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:37.849318    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:37.849429    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:38.347018    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:38.347043    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:38.347055    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:38.347059    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:38.350460    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:38.847528    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:38.847544    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:38.847550    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:38.847554    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:38.849461    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:39.346721    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:39.346741    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:39.346752    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:39.346758    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:39.349742    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:39.846123    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:39.846146    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:39.846158    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:39.846164    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:39.849435    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:39.849503    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:40.346540    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:40.346552    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:40.346558    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:40.346560    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:40.348654    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:40.846152    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:40.846173    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:40.846184    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:40.846206    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:40.849347    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:41.346538    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:41.346550    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:41.346556    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:41.346560    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:41.348413    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:41.846620    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:41.846633    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:41.846639    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:41.846642    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:41.848943    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:42.347207    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:42.347233    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:42.347277    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:42.347287    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:42.350122    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:42.350199    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:42.846206    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:42.846231    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:42.846243    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:42.846251    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:42.849366    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:43.346675    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:43.346691    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:43.346724    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:43.346728    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:43.348764    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:43.846267    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:43.846289    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:43.846301    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:43.846306    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:43.849927    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:44.346504    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:44.346524    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:44.346532    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:44.346540    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:44.349860    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:44.847166    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:44.847180    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:44.847186    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:44.847193    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:44.849509    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:44.849569    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:45.347208    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:45.347222    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:45.347229    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:45.347232    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:45.349172    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:45.846510    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:45.846534    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:45.846545    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:45.846551    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:45.849782    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:46.346141    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:46.346158    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:46.346164    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:46.346167    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:46.347845    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:46.848226    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:46.848252    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:46.848263    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:46.848271    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:46.851712    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:46.851793    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:47.346279    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:47.346291    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:47.346297    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:47.346300    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:47.349863    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:47.847020    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:47.847037    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:47.847043    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:47.847046    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:47.848989    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:48.346969    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:48.346995    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:48.347053    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:48.347063    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:48.350507    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:48.847023    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:48.847043    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:48.847054    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:48.847060    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:48.850155    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:49.348069    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:49.348085    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:49.348091    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:49.348097    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:49.350031    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:49.350125    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:49.846786    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:49.846812    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:49.846834    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:49.846844    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:49.850238    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:50.347108    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:50.347128    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:50.347139    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:50.347144    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:50.350196    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:50.846164    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:50.846180    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:50.846186    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:50.846190    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:50.848092    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:51.347436    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:51.347460    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:51.347471    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:51.347477    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:51.351123    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:51.351195    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:51.847405    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:51.847419    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:51.847428    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:51.847433    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:51.849913    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:52.347071    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:52.347083    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:52.347093    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:52.347096    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:52.349220    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:52.847909    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:52.847932    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:52.847943    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:52.847951    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:52.851063    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:53.346206    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:53.346218    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:53.346224    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:53.346228    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:53.348204    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:53.847919    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:53.847935    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:53.847941    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:53.847945    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:53.849907    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:45:53.850011    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:54.347348    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:54.347369    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:54.347380    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:54.347385    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:54.351482    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:45:54.846431    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:54.846455    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:54.846466    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:54.846471    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:54.849557    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:55.348109    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:55.348121    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:55.348128    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:55.348131    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:55.350338    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:55.848148    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:55.848170    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:55.848181    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:55.848200    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:55.851241    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:55.851306    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:56.347660    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:56.347686    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:56.347697    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:56.347703    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:56.351077    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:56.847124    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:56.847140    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:56.847146    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:56.847159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:56.849236    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:57.347401    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:57.347416    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:57.347444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:57.347448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:57.350002    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:57.847762    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:57.847778    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:57.847786    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:57.847792    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:57.849933    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:58.347634    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:58.347646    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:58.347652    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:58.347654    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:58.349839    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:58.349896    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:45:58.846561    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:58.846632    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:58.846645    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:58.846652    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:58.849247    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:45:59.347174    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:59.347196    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:59.347208    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:59.347215    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:59.350401    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:45:59.847088    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:45:59.847103    4003 round_trippers.go:469] Request Headers:
	I0831 15:45:59.847119    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:45:59.847134    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:45:59.849352    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:00.347687    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:00.347714    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:00.347726    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:00.347734    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:00.351744    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:00.351819    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:00.848047    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:00.848068    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:00.848079    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:00.848086    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:00.851749    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:01.347871    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:01.347886    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:01.347895    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:01.347899    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:01.350037    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:01.847381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:01.847403    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:01.847414    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:01.847423    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:01.850418    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:02.347961    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:02.347983    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:02.347992    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:02.347997    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:02.351704    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:02.351882    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:02.846644    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:02.846656    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:02.846663    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:02.846667    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:02.848618    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:03.346482    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:03.346503    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:03.346515    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:03.346522    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:03.349938    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:03.846526    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:03.846556    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:03.846616    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:03.846639    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:03.850171    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:04.346820    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:04.346836    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:04.346843    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:04.346860    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:04.349066    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:04.846842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:04.846858    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:04.846868    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:04.846873    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:04.848643    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:04.848700    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:05.348383    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:05.348410    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:05.348423    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:05.348481    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:05.351822    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:05.846904    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:05.846917    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:05.846924    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:05.846927    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:05.848737    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:06.347363    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:06.347388    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:06.347426    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:06.347435    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:06.349807    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:06.846388    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:06.846402    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:06.846411    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:06.846417    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:06.848695    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:06.848754    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:07.346938    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:07.346964    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:07.346991    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:07.347032    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:07.350784    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:07.848381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:07.848408    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:07.848425    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:07.848433    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:07.851814    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:08.348378    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:08.348403    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:08.348415    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:08.348420    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:08.351770    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:08.846356    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:08.846371    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:08.846377    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:08.846382    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:08.848517    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:09.346659    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:09.346686    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:09.346696    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:09.346705    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:09.349594    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:09.349709    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:09.846024    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:09.846037    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:09.846043    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:09.846047    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:09.847975    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:10.346809    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:10.346834    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:10.346845    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:10.346851    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:10.350256    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:10.844381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:10.844403    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:10.844415    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:10.844422    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:10.847674    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:11.344377    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:11.344394    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:11.344400    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:11.344403    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:11.346485    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:11.843236    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:11.843247    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:11.843253    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:11.843257    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:11.845363    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:11.845422    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:12.343795    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:12.343813    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:12.343825    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:12.343840    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:12.347319    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:12.844111    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:12.844127    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:12.844133    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:12.844135    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:12.845879    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:13.343860    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:13.343887    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:13.343899    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:13.343904    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:13.347005    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:13.842634    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:13.842656    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:13.842668    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:13.842674    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:13.845855    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:13.845928    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:14.341496    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:14.341511    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:14.341518    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:14.341522    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:14.343436    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:14.841234    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:14.841255    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:14.841265    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:14.841270    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:14.844398    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:15.341763    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:15.341785    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:15.341796    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:15.341802    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:15.345605    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:15.840145    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:15.840161    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:15.840167    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:15.840170    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:15.842412    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:16.339596    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:16.339612    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:16.339621    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:16.339625    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:16.341841    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:16.341895    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:16.840537    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:16.840560    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:16.840580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:16.840588    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:16.844162    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:17.339830    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:17.339847    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:17.339853    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:17.339862    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:17.341955    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:17.838709    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:17.838734    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:17.838745    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:17.838752    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:17.841971    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:18.339902    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:18.339925    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:18.339936    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:18.339942    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:18.343048    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:18.343121    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:18.837997    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:18.838010    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:18.838017    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:18.838020    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:18.842582    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:46:19.339010    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:19.339088    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:19.339099    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:19.339106    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:19.342495    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:19.839240    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:19.839263    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:19.839274    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:19.839283    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:19.842630    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:20.337822    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:20.337838    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:20.337846    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:20.337852    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:20.339893    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:20.838112    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:20.838140    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:20.838153    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:20.838160    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:20.841535    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:20.841611    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:21.336887    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:21.336902    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:21.336911    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:21.336915    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:21.339247    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:21.837400    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:21.837412    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:21.837416    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:21.837421    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:21.839410    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:22.337957    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:22.337984    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:22.338002    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:22.338016    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:22.341209    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:22.837636    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:22.837662    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:22.837673    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:22.837679    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:22.841366    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:22.841502    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:23.337276    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:23.337291    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:23.337304    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:23.337307    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:23.339521    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:23.836608    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:23.836631    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:23.836644    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:23.836651    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:23.839652    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:24.336381    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:24.336438    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:24.336453    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:24.336461    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:24.339223    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:24.834790    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:24.834810    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:24.834835    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:24.834839    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:24.837005    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:25.335102    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:25.335128    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:25.335139    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:25.335148    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:25.338326    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:25.338462    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:25.835276    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:25.835338    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:25.835361    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:25.835369    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:25.838385    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:26.334552    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:26.334565    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:26.334571    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:26.334574    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:26.336860    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:26.834506    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:26.834518    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:26.834524    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:26.834529    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:26.836177    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:27.334080    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:27.334107    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:27.334118    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:27.334125    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:27.337217    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:27.835003    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:27.835014    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:27.835020    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:27.835023    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:27.837029    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:27.837086    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:28.334519    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:28.334541    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:28.334554    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:28.334561    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:28.338051    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:28.834531    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:28.834552    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:28.834564    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:28.834570    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:28.837555    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:29.333171    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:29.333183    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:29.333190    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:29.333193    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:29.335112    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:29.833314    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:29.833337    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:29.833348    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:29.833354    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:29.836452    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:29.836529    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:30.334371    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:30.334430    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:30.334444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:30.334452    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:30.337476    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:30.833481    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:30.833496    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:30.833502    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:30.833506    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:30.835694    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:31.333667    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:31.333787    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:31.333806    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:31.333812    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:31.337229    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:31.832937    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:31.832963    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:31.832976    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:31.832982    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:31.836197    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:31.836277    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:32.334027    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:32.334042    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:32.334049    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:32.334052    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:32.336000    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:32.832302    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:32.832329    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:32.832340    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:32.832349    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:32.835491    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:33.332732    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:33.332754    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:33.332765    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:33.332774    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:33.336007    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:33.832656    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:33.832672    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:33.832678    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:33.832681    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:33.836925    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:46:33.836986    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:34.332711    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:34.332735    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:34.332744    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:34.332748    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:34.336280    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:34.832778    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:34.832803    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:34.832815    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:34.832821    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:34.836052    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:35.331831    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:35.331847    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:35.331853    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:35.331855    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:35.333909    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:35.833174    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:35.833199    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:35.833210    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:35.833217    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:35.836522    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:35.836602    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:36.331760    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:36.331785    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:36.331797    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:36.331808    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:36.335187    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:36.831430    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:36.831443    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:36.831449    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:36.831452    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:36.833390    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:37.332076    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:37.332102    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:37.332113    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:37.332120    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:37.337064    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:46:37.831843    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:37.831865    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:37.831875    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:37.831882    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:37.834817    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:38.330953    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:38.330969    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:38.330996    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:38.331001    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:38.332836    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:38.332899    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:38.831091    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:38.831111    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:38.831134    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:38.831141    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:38.834085    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:39.330988    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:39.331010    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:39.331021    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:39.331030    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:39.334198    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:39.830708    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:39.830722    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:39.830728    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:39.830731    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:39.833084    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:40.331955    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:40.331978    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:40.331988    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:40.331995    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:40.335663    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:40.335827    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:40.831715    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:40.831736    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:40.831748    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:40.831753    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:40.834480    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:41.331801    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:41.331816    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:41.331824    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:41.331828    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:41.333947    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:41.830652    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:41.830674    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:41.830686    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:41.830692    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:41.833807    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:42.330632    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:42.330682    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:42.330694    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:42.330701    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:42.333713    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:42.830375    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:42.830390    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:42.830397    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:42.830400    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:42.832629    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:42.832686    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:43.330682    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:43.330708    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:43.330719    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:43.330725    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:43.333898    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:43.831092    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:43.831113    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:43.831125    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:43.831132    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:43.834043    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:44.331020    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:44.331035    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:44.331041    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:44.331044    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:44.333218    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:44.830357    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:44.830379    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:44.830390    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:44.830397    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:44.833640    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:44.833710    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:45.331564    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:45.331586    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:45.331598    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:45.331602    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:45.334717    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:45.830842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:45.830857    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:45.830864    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:45.830868    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:45.832745    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:46.330292    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:46.330318    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:46.330330    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:46.330346    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:46.333844    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:46.830138    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:46.830164    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:46.830175    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:46.830183    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:46.833916    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:46.833987    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:47.330364    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:47.330380    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:47.330386    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:47.330389    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:47.332650    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:47.830666    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:47.830689    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:47.830701    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:47.830710    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:47.833714    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:48.330763    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:48.330784    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:48.330796    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:48.330804    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:48.334071    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:48.831187    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:48.831203    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:48.831209    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:48.831212    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:48.833347    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:49.330476    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:49.330500    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:49.330511    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:49.330517    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:49.333785    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:49.333851    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:49.831216    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:49.831242    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:49.831252    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:49.831272    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:49.834540    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:50.329535    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:50.329548    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:50.329554    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:50.329557    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:50.331810    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:50.829989    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:50.830011    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:50.830022    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:50.830030    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:50.833229    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:51.329962    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:51.329982    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:51.329998    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:51.330005    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:51.333236    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:51.830064    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:51.830077    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:51.830084    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:51.830087    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:51.832177    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:51.832239    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:52.330485    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:52.330510    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:52.330522    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:52.330528    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:52.334017    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:52.830400    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:52.830425    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:52.830436    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:52.830442    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:52.833770    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:53.329566    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:53.329579    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:53.329585    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:53.329589    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:53.331657    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:53.831320    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:53.831345    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:53.831357    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:53.831367    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:53.834615    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:53.834695    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:54.330494    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:54.330520    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:54.330537    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:54.330543    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:54.333826    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:54.830758    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:54.830774    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:54.830780    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:54.830783    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:54.832979    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:55.330573    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:55.330607    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:55.330642    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:55.330652    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:55.334018    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:55.830009    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:55.830030    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:55.830039    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:55.830045    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:55.833311    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:56.329121    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:56.329135    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:56.329150    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:56.329154    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:56.331151    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:56.331267    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:56.829636    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:56.829658    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:56.829676    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:56.829683    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:56.832997    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:46:57.330100    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:57.330164    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:57.330179    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:57.330185    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:57.332967    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:57.830447    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:57.830460    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:57.830466    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:57.830473    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:57.832494    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:58.330373    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:58.330394    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:58.330406    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:58.330411    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:58.333052    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:58.333119    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:46:58.829621    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:58.829634    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:58.829640    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:58.829644    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:58.832012    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:46:59.329472    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:59.329486    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:59.329493    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:59.329497    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:59.331476    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:46:59.828991    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:46:59.829004    4003 round_trippers.go:469] Request Headers:
	I0831 15:46:59.829010    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:46:59.829013    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:46:59.832279    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:00.329603    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:00.329622    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:00.329633    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:00.329639    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:00.332733    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:00.830103    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:00.830116    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:00.830122    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:00.830125    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:00.837585    4003 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
	I0831 15:47:00.837647    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:01.330092    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:01.330112    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:01.330124    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:01.330132    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:01.333438    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:01.830117    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:01.830142    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:01.830152    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:01.830156    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:01.833106    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:02.330382    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:02.330398    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:02.330411    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:02.330415    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:02.332370    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:02.829065    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:02.829088    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:02.829101    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:02.829108    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:02.831924    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:03.330094    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:03.330120    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:03.330131    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:03.330136    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:03.333449    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:03.333526    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:03.830291    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:03.830308    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:03.830314    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:03.830317    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:03.832083    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:04.330231    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:04.330254    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:04.330289    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:04.330297    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:04.332924    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:04.829724    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:04.829747    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:04.829759    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:04.829767    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:04.833424    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:05.329231    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:05.329246    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:05.329253    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:05.329255    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:05.331317    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:05.828856    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:05.828876    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:05.828887    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:05.828893    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:05.831350    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:05.831420    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:06.329491    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:06.329514    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:06.329526    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:06.329535    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:06.332911    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:06.830113    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:06.830137    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:06.830167    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:06.830171    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:06.832311    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:07.328832    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:07.328852    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:07.328865    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:07.328872    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:07.331707    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:07.830142    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:07.830169    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:07.830210    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:07.830218    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:07.833304    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:07.833425    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:08.330192    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:08.330204    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:08.330211    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:08.330215    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:08.332216    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:08.829708    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:08.829721    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:08.829728    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:08.829731    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:08.832016    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:09.329901    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:09.329921    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:09.329934    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:09.329939    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:09.332962    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:09.829856    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:09.829869    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:09.829876    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:09.829879    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:09.831857    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:10.329372    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:10.329432    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:10.329446    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:10.329452    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:10.332160    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:10.332227    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:10.829229    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:10.829253    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:10.829265    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:10.829272    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:10.833374    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:47:11.330031    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:11.330047    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:11.330053    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:11.330057    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:11.332038    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:11.829331    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:11.829357    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:11.829372    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:11.829379    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:11.832648    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:12.329974    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:12.329989    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:12.329996    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:12.329999    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:12.332071    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:12.830134    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:12.830150    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:12.830156    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:12.830161    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:12.832336    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:12.832389    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:13.329321    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:13.329343    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:13.329353    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:13.329359    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:13.332441    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:13.828908    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:13.828931    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:13.828943    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:13.828950    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:13.832731    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:14.329733    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:14.329748    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:14.329755    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:14.329758    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:14.331982    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:14.830417    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:14.830445    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:14.830486    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:14.830493    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:14.833911    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:14.833990    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:15.328769    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:15.328790    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:15.328802    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:15.328812    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:15.331836    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:15.829268    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:15.829280    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:15.829286    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:15.829290    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:15.831233    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:16.329720    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:16.329739    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:16.329750    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:16.329758    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:16.332304    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:16.829209    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:16.829226    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:16.829234    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:16.829237    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:16.831627    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:17.330054    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:17.330070    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:17.330076    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:17.330079    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:17.332072    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:17.332162    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:17.829699    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:17.829721    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:17.829733    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:17.829738    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:17.833375    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:18.329515    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:18.329535    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:18.329546    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:18.329552    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:18.332114    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:18.829215    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:18.829228    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:18.829234    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:18.829237    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:18.831755    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:19.329707    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:19.329721    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:19.329728    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:19.329733    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:19.331565    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:19.830156    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:19.830177    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:19.830189    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:19.830198    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:19.833385    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:19.833450    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:20.328992    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:20.329004    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:20.329010    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:20.329014    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:20.331474    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:20.829297    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:20.829321    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:20.829332    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:20.829342    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:20.832512    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:21.329420    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:21.329442    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:21.329454    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:21.329460    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:21.332977    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:21.830340    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:21.830375    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:21.830384    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:21.830389    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:21.832344    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:22.330124    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:22.330146    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:22.330157    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:22.330164    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:22.332847    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:22.332923    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:22.829382    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:22.829408    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:22.829452    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:22.829461    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:22.832159    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:23.329407    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:23.329422    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:23.329429    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:23.329432    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:23.331410    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:23.829613    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:23.829636    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:23.829648    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:23.829654    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:23.832995    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:24.328868    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:24.328900    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:24.328966    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:24.328977    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:24.331905    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:24.829531    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:24.829552    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:24.829562    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:24.829567    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:24.832215    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:24.832290    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:25.329491    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:25.329512    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:25.329523    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:25.329531    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:25.332387    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:25.829129    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:25.829150    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:25.829161    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:25.829170    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:25.831914    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:26.329975    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:26.329998    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:26.330010    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:26.330016    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:26.332377    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:26.828755    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:26.828780    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:26.828794    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:26.828801    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:26.832060    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:27.328656    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:27.328678    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:27.328686    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:27.328696    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:27.332476    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:27.332537    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:27.829167    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:27.829178    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:27.829184    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:27.829187    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:27.830801    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:28.329337    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:28.329357    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:28.329368    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:28.329374    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:28.331877    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:28.828686    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:28.828709    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:28.828719    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:28.828725    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:28.831646    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:29.328641    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:29.328656    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:29.328666    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:29.328670    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:29.330609    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:29.828963    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:29.828978    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:29.828984    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:29.828988    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:29.831283    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:29.831335    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:30.329840    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:30.329859    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:30.329870    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:30.329876    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:30.332855    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:30.830461    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:30.830506    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:30.830516    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:30.830521    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:30.832580    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:31.330097    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:31.330110    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:31.330117    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:31.330120    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:31.331769    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:31.828676    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:31.828694    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:31.828706    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:31.828712    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:31.831715    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:31.831786    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:32.328645    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:32.328695    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:32.328704    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:32.328711    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:32.330855    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:32.830681    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:32.830701    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:32.830711    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:32.830717    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:32.833687    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:33.330045    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:33.330067    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:33.330080    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:33.330088    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:33.333035    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:33.829438    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:33.829470    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:33.829481    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:33.829486    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:33.832104    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:33.832209    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:34.329654    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:34.329677    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:34.329691    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:34.329700    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:34.332562    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:34.828622    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:34.828641    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:34.828652    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:34.828657    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:34.831130    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:35.328804    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:35.328825    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:35.328836    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:35.328843    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:35.331419    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:35.829296    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:35.829317    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:35.829329    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:35.829336    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:35.832744    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:35.832822    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:36.328855    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:36.328879    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:36.328890    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:36.328896    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:36.331894    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:36.828612    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:36.828632    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:36.828644    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:36.828650    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:36.831201    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:37.329040    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:37.329061    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:37.329076    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:37.329082    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:37.332359    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:37.828655    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:37.828667    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:37.828673    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:37.828676    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:37.830554    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:38.328877    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:38.328890    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:38.328896    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:38.328900    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:38.330918    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:38.330989    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:38.828965    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:38.828995    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:38.829051    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:38.829058    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:38.832125    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:39.329128    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:39.329186    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:39.329201    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:39.329209    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:39.332056    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:39.829820    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:39.829836    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:39.829842    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:39.829846    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:39.832218    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:40.328814    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:40.328863    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:40.328877    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:40.328883    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:40.331799    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:40.331871    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:40.829864    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:40.829888    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:40.829904    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:40.829911    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:40.832995    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:41.329765    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:41.329783    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:41.329792    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:41.329797    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:41.332227    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:41.830043    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:41.830062    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:41.830073    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:41.830079    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:41.833230    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:42.330723    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:42.330747    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:42.330787    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:42.330795    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:42.333941    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:42.334034    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:42.829938    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:42.829951    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:42.829957    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:42.829960    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:42.831505    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:43.329861    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:43.329884    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:43.329897    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:43.329903    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:43.333660    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:43.829116    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:43.829142    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:43.829154    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:43.829159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:43.832310    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:44.330318    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:44.330336    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:44.330344    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:44.330350    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:44.332716    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:44.829324    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:44.829351    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:44.829363    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:44.829368    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:44.832857    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:44.832967    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:45.329358    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:45.329380    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:45.329391    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:45.329399    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:45.332784    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:45.829653    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:45.829668    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:45.829675    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:45.829679    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:45.831809    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:46.328769    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:46.328794    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:46.328807    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:46.328812    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:46.331758    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:46.829308    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:46.829333    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:46.829345    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:46.829350    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:46.832699    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:47.330622    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:47.330649    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:47.330701    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:47.330710    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:47.333673    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:47.333746    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:47.829700    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:47.829724    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:47.829735    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:47.829739    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:47.832430    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:48.329697    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:48.329719    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:48.329730    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:48.329739    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:48.333104    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:48.828962    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:48.828977    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:48.828986    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:48.828990    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:48.831389    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:49.329842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:49.329867    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:49.329876    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:49.329883    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:49.332642    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:49.830281    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:49.830308    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:49.830319    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:49.830327    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:49.833684    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:49.833787    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:50.329816    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:50.329831    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:50.329837    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:50.329842    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:50.331764    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:50.829053    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:50.829076    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:50.829088    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:50.829095    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:50.832256    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:51.330225    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:51.330255    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:51.330270    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:51.330281    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:51.333711    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:51.829842    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:51.829861    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:51.829872    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:51.829878    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:51.832473    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:52.329568    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:52.329595    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:52.329606    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:52.329618    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:52.332450    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:52.332569    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:52.829778    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:52.829805    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:52.829816    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:52.829822    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:52.833363    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:53.329291    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:53.329306    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:53.329313    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:53.329317    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:53.331172    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:53.830270    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:53.830295    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:53.830306    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:53.830314    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:53.833837    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:54.330125    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:54.330151    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:54.330162    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:54.330168    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:54.333461    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:54.333541    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:54.829293    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:54.829321    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:54.829334    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:54.829341    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:54.832226    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:55.330712    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:55.330738    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:55.330749    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:55.330757    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:55.334141    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:55.828817    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:55.828872    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:55.828887    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:55.828895    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:55.831682    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:56.329544    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:56.329568    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:56.329580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:56.329588    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:56.332148    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:47:56.830699    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:56.830725    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:56.830736    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:56.830743    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:56.834490    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:56.834565    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:57.329839    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:57.329861    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:57.329873    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:57.329878    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:57.333090    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:57.828829    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:57.828910    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:57.828916    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:57.828920    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:57.830711    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:47:58.328896    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:58.328923    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:58.328934    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:58.328940    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:58.332463    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:58.828817    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:58.828842    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:58.828854    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:58.828862    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:58.832243    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:59.330153    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:59.330177    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:59.330188    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:59.330193    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:59.333357    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:47:59.333454    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:47:59.830783    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:47:59.830807    4003 round_trippers.go:469] Request Headers:
	I0831 15:47:59.830818    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:47:59.830876    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:47:59.834206    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:00.329131    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:00.329150    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:00.329159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:00.329164    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:00.331350    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:00.830148    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:00.830172    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:00.830238    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:00.830248    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:00.832938    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:01.330744    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:01.330765    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:01.330776    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:01.330781    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:01.334219    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:01.334299    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:01.828849    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:01.828871    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:01.828882    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:01.828890    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:01.832151    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:02.329416    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:02.329435    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:02.329444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:02.329448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:02.332568    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:02.829933    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:02.829960    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:02.830044    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:02.830051    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:02.833123    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:03.328950    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:03.328972    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:03.328981    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:03.328989    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:03.331913    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:03.829379    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:03.829445    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:03.829462    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:03.829469    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:03.832420    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:03.832488    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:04.330783    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:04.330808    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:04.330819    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:04.330825    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:04.333835    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:04.828809    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:04.828835    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:04.828844    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:04.828852    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:04.832228    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:05.330083    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:05.330103    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:05.330115    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:05.330122    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:05.333301    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:05.829216    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:05.829239    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:05.829250    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:05.829257    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:05.832698    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:05.832773    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:06.329078    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:06.329103    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:06.329116    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:06.329123    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:06.332045    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:06.830238    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:06.830261    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:06.830306    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:06.830316    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:06.833538    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:07.330777    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:07.330798    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:07.330808    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:07.330815    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:07.334065    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:07.829264    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:07.829288    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:07.829330    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:07.829338    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:07.832368    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:08.329114    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:08.329178    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:08.329193    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:08.329211    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:08.332086    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:08.332156    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:08.829364    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:08.829385    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:08.829397    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:08.829404    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:08.832446    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:09.328860    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:09.328872    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:09.328878    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:09.328881    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:09.331153    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:09.829450    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:09.829472    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:09.829482    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:09.829490    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:09.839325    4003 round_trippers.go:574] Response Status: 404 Not Found in 9 milliseconds
	I0831 15:48:10.329202    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:10.329227    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:10.329290    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:10.329300    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:10.336072    4003 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0831 15:48:10.336141    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:10.829298    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:10.829320    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:10.829333    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:10.829339    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:10.832656    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:11.328862    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:11.328879    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:11.328886    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:11.328890    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:11.331251    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:11.828789    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:11.828814    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:11.828825    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:11.828830    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:11.831875    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:12.329621    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:12.329641    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:12.329652    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:12.329657    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:12.332812    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:12.829177    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:12.829198    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:12.829209    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:12.829215    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:12.832205    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:12.832271    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:13.329690    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:13.329709    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:13.329721    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:13.329726    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:13.332350    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:13.830163    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:13.830187    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:13.830200    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:13.830207    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:13.833785    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:14.330813    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:14.330871    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:14.330889    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:14.330897    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:14.333729    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:14.829241    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:14.829256    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:14.829265    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:14.829271    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:14.831656    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:15.329102    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:15.329117    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:15.329125    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:15.329128    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:15.331035    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:15.331094    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:15.829453    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:15.829477    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:15.829490    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:15.829498    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:15.832921    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:16.330482    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:16.330501    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:16.330512    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:16.330519    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:16.333392    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:16.829494    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:16.829514    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:16.829526    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:16.829531    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:16.832666    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:17.328819    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:17.328832    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:17.328838    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:17.328842    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:17.332907    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:48:17.333002    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:17.830033    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:17.830052    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:17.830063    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:17.830071    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:17.833459    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:18.330056    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:18.330077    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:18.330089    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:18.330094    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:18.333155    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:18.830388    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:18.830402    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:18.830408    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:18.830411    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:18.832447    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:19.329634    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:19.329659    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:19.329671    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:19.329677    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:19.333012    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:19.333085    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:19.829599    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:19.829619    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:19.829631    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:19.829639    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:19.833057    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:20.330129    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:20.330145    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:20.330151    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:20.330154    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:20.331920    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:20.829042    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:20.829056    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:20.829065    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:20.829069    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:20.831640    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:21.330321    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:21.330345    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:21.330357    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:21.330364    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:21.333593    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:21.333742    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:21.829489    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:21.829509    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:21.829521    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:21.829528    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:21.832949    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:22.329074    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:22.329097    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:22.329109    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:22.329115    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:22.332552    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:22.829496    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:22.829514    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:22.829523    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:22.829528    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:22.831769    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:23.329638    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:23.329654    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:23.329662    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:23.329666    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:23.332063    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:23.830053    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:23.830067    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:23.830105    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:23.830115    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:23.832192    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:23.832251    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:24.329240    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:24.329260    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:24.329272    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:24.329277    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:24.332009    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:24.830470    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:24.830482    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:24.830488    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:24.830491    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:24.835168    4003 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0831 15:48:25.330931    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:25.330957    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:25.330968    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:25.330974    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:25.334396    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:25.830021    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:25.830047    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:25.830057    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:25.830063    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:25.833612    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:25.833684    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:26.330695    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:26.330715    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:26.330726    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:26.330733    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:26.333858    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:26.829799    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:26.829824    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:26.829833    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:26.829838    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:26.833084    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:27.329417    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:27.329439    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:27.329450    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:27.329457    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:27.333005    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:27.829654    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:27.829674    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:27.829685    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:27.829693    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:27.832427    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:28.329524    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:28.329539    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:28.329580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:28.329585    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:28.331632    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:28.331748    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:28.829893    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:28.829913    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:28.829925    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:28.829932    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:28.833039    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:29.329166    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:29.329185    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:29.329193    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:29.329197    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:29.331783    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:29.829024    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:29.829051    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:29.829062    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:29.829070    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:29.832264    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:30.328905    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:30.328931    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:30.328942    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:30.328947    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:30.332052    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:30.332123    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:30.830052    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:30.830072    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:30.830082    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:30.830091    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:30.833325    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:31.330324    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:31.330348    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:31.330360    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:31.330365    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:31.333570    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:31.830355    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:31.830379    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:31.830391    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:31.830448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:31.833417    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:32.330044    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:32.330081    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:32.330090    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:32.330097    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:32.332188    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:32.332242    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:32.828972    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:32.828987    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:32.828994    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:32.828997    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:32.830746    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:33.330302    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:33.330324    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:33.330335    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:33.330342    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:33.333187    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:33.828871    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:33.828885    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:33.828891    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:33.828894    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:33.830679    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:34.329246    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:34.329269    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:34.329284    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:34.329293    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:34.332379    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:34.332447    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:34.828888    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:34.828903    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:34.828936    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:34.828941    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:34.836178    4003 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
	I0831 15:48:35.330611    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:35.330647    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:35.330655    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:35.330658    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:35.333046    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:35.829308    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:35.829333    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:35.829344    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:35.829352    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:35.832682    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:36.329920    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:36.329937    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:36.329976    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:36.329982    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:36.332428    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:36.332513    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:36.830494    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:36.830509    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:36.830515    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:36.830550    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:36.832561    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:37.329913    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:37.329933    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:37.329944    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:37.329949    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:37.332838    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:37.829024    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:37.829050    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:37.829062    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:37.829069    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:37.832669    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:38.330684    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:38.330699    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:38.330705    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:38.330708    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:38.332762    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:38.332823    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:38.829400    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:38.829426    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:38.829444    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:38.829450    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:38.832697    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:39.330303    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:39.330331    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:39.330342    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:39.330348    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:39.333360    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:39.829748    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:39.829768    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:39.829777    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:39.829781    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:39.832089    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:40.328868    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:40.328892    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:40.328903    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:40.328908    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:40.331956    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:40.829153    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:40.829180    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:40.829192    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:40.829199    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:40.832739    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:40.832818    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:41.330714    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:41.330729    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:41.330735    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:41.330738    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:41.332850    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:41.829181    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:41.829207    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:41.829217    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:41.829225    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:41.832653    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:42.330611    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:42.330634    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:42.330646    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:42.330655    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:42.334145    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:42.830611    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:42.830650    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:42.830658    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:42.830662    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:42.832630    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:43.329836    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:43.329858    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:43.329870    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:43.329877    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:43.333122    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:43.333193    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:43.829159    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:43.829183    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:43.829194    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:43.829200    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:43.832264    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:44.330509    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:44.330524    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:44.330531    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:44.330537    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:44.332882    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:44.829657    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:44.829680    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:44.829695    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:44.829700    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:44.832675    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:45.329175    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:45.329200    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:45.329211    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:45.329217    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:45.332400    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:45.829172    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:45.829184    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:45.829191    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:45.829194    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:45.831511    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:45.831573    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:46.329275    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:46.329302    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:46.329312    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:46.329318    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:46.332403    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:46.829488    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:46.829509    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:46.829521    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:46.829527    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:46.832727    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:47.329181    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:47.329197    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:47.329202    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:47.329205    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:47.331729    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:47.829140    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:47.829163    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:47.829175    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:47.829182    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:47.832590    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:47.832666    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:48.329582    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:48.329624    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:48.329632    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:48.329639    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:48.332262    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:48.829927    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:48.829940    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:48.829948    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:48.829951    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:48.832095    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:49.329030    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:49.329054    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:49.329067    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:49.329073    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:49.331713    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:49.829998    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:49.830024    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:49.830035    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:49.830042    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:49.833387    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:49.833457    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:50.329328    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:50.329345    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:50.329351    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:50.329355    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:50.331789    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:50.829290    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:50.829312    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:50.829323    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:50.829327    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:50.832450    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:51.329373    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:51.329396    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:51.329407    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:51.329413    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:51.332584    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:51.828974    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:51.828993    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:51.828999    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:51.829002    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:51.831143    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:52.329568    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:52.329582    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:52.329588    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:52.329591    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:52.331474    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:52.331532    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:52.828983    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:52.829009    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:52.829020    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:52.829027    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:52.831923    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:53.330254    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:53.330266    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:53.330272    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:53.330275    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:53.332376    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:53.829955    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:53.829977    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:53.829986    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:53.829991    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:53.833487    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:54.330025    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:54.330048    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:54.330058    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:54.330064    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:54.332846    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:54.332916    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:54.829445    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:54.829461    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:54.829469    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:54.829473    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:54.831681    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:55.330304    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:55.330329    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:55.330339    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:55.330343    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:55.333464    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:55.829335    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:55.829357    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:55.829372    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:55.829379    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:55.832747    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:56.329562    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:56.329574    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:56.329580    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:56.329583    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:56.331549    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:48:56.830534    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:56.830555    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:56.830566    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:56.830571    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:56.834033    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:56.834111    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:57.329183    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:57.329210    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:57.329220    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:57.329229    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:57.332084    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:57.829250    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:57.829263    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:57.829269    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:57.829273    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:57.831424    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:58.329091    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:58.329116    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:58.329126    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:58.329131    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:58.331697    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:58.831142    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:58.831167    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:58.831178    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:58.831185    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:58.834492    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:48:58.834557    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:48:59.329237    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:59.329252    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:59.329257    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:59.329261    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:59.331512    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:48:59.829320    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:48:59.829342    4003 round_trippers.go:469] Request Headers:
	I0831 15:48:59.829353    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:48:59.829359    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:48:59.832197    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:00.330432    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:00.330451    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:00.330462    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:00.330471    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:00.333067    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:00.829237    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:00.829253    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:00.829260    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:00.829263    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:00.831418    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:01.329358    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:01.329379    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:01.329388    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:01.329393    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:01.332371    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:01.332438    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:01.830578    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:01.830604    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:01.830617    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:01.830623    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:01.834159    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:02.329157    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:02.329173    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:02.329179    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:02.329182    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:02.331067    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:49:02.831085    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:02.831112    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:02.831123    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:02.831130    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:02.834437    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:03.331085    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:03.331109    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:03.331152    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:03.331159    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:03.334347    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:03.334422    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:03.829836    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:03.829853    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:03.829859    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:03.829863    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:03.831902    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:04.331065    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:04.331089    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:04.331100    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:04.331107    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:04.334167    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:04.831234    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:04.831261    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:04.831273    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:04.831279    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:04.834602    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:05.330136    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:05.330151    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:05.330157    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:05.330160    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:05.332374    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:05.830128    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:05.830150    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:05.830165    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:05.830171    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:05.834152    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:05.834213    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:06.329879    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:06.329904    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:06.329915    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:06.329924    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:06.332822    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:06.829369    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:06.829385    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:06.829390    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:06.829393    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:06.831713    4003 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0831 15:49:07.329339    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:07.329361    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:07.329373    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:07.329380    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:07.332647    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:07.830352    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:07.830380    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:07.830437    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:07.830448    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:07.833556    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:08.329058    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:08.329073    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:08.329079    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:08.329082    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:08.331089    4003 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0831 15:49:08.331148    4003 node_ready.go:53] error getting node "ha-949000-m04": nodes "ha-949000-m04" not found
	I0831 15:49:08.830337    4003 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-949000-m04
	I0831 15:49:08.830361    4003 round_trippers.go:469] Request Headers:
	I0831 15:49:08.830372    4003 round_trippers.go:473]     Accept: application/json, */*
	I0831 15:49:08.830379    4003 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 15:49:08.833728    4003 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0831 15:49:08.833817    4003 node_ready.go:38] duration metric: took 4m0.004911985s for node "ha-949000-m04" to be "Ready" ...
	I0831 15:49:08.856471    4003 out.go:201] 
	W0831 15:49:08.878133    4003 out.go:270] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0831 15:49:08.878147    4003 out.go:270] * 
	W0831 15:49:08.878920    4003 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 15:49:08.943376    4003 out.go:201] 
	
	
	==> Docker <==
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.332263033Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.370445559Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.370708492Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.370824991Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.371374304Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.371365499Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.371690677Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.371839495Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.372326970Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.374135025Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.379001438Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.379117671Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.381398964Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.411323783Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.411385669Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.411398736Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:44:39 ha-949000 dockerd[1161]: time="2024-08-31T22:44:39.411510078Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:45:09 ha-949000 dockerd[1154]: time="2024-08-31T22:45:09.824046002Z" level=info msg="ignoring event" container=216b25e04efdd68fa78ff1cfc79456f27ab236602c5e05f800a59fa3cb220480 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Aug 31 22:45:09 ha-949000 dockerd[1161]: time="2024-08-31T22:45:09.824322056Z" level=info msg="shim disconnected" id=216b25e04efdd68fa78ff1cfc79456f27ab236602c5e05f800a59fa3cb220480 namespace=moby
	Aug 31 22:45:09 ha-949000 dockerd[1161]: time="2024-08-31T22:45:09.824375729Z" level=warning msg="cleaning up after shim disconnected" id=216b25e04efdd68fa78ff1cfc79456f27ab236602c5e05f800a59fa3cb220480 namespace=moby
	Aug 31 22:45:09 ha-949000 dockerd[1161]: time="2024-08-31T22:45:09.824925130Z" level=info msg="cleaning up dead shim" namespace=moby
	Aug 31 22:45:23 ha-949000 dockerd[1161]: time="2024-08-31T22:45:23.385665751Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 22:45:23 ha-949000 dockerd[1161]: time="2024-08-31T22:45:23.385739452Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 22:45:23 ha-949000 dockerd[1161]: time="2024-08-31T22:45:23.385752198Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 22:45:23 ha-949000 dockerd[1161]: time="2024-08-31T22:45:23.385842000Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	11a121a84e236       6e38f40d628db       5 minutes ago       Running             storage-provisioner       4                   675a87e7bbf1d       storage-provisioner
	18fa81194c803       8c811b4aec35f       5 minutes ago       Running             busybox                   2                   a1fb1144f3287       busybox-7dff88458-5kkbw
	5b45844943a70       cbb01a7bd410d       5 minutes ago       Running             coredns                   2                   4ab6f492ffa53       coredns-6f6b679f8f-snq8s
	39caece4a1a06       12968670680f4       5 minutes ago       Running             kindnet-cni               2                   84921ed532424       kindnet-jzj42
	216b25e04efdd       6e38f40d628db       5 minutes ago       Exited              storage-provisioner       3                   675a87e7bbf1d       storage-provisioner
	92325d0ba5d32       cbb01a7bd410d       5 minutes ago       Running             coredns                   2                   9a17b13011ad6       coredns-6f6b679f8f-kjszm
	ce00ce382bb0c       ad83b2ca7b09e       5 minutes ago       Running             kube-proxy                2                   563c95c71d5ae       kube-proxy-q7ndn
	ca5e9a101fac2       045733566833c       6 minutes ago       Running             kube-controller-manager   4                   0976cb0a1281b       kube-controller-manager-ha-949000
	8be9164123bc9       604f5db92eaa8       6 minutes ago       Running             kube-apiserver            3                   e0447c649afe4       kube-apiserver-ha-949000
	6c320a1f78aee       1766f54c897f0       7 minutes ago       Running             kube-scheduler            2                   515614d004b25       kube-scheduler-ha-949000
	c016f5fcb7d72       2e96e5913fc06       7 minutes ago       Running             etcd                      2                   716e9fa824b03       etcd-ha-949000
	981e8e790a392       045733566833c       7 minutes ago       Exited              kube-controller-manager   3                   0976cb0a1281b       kube-controller-manager-ha-949000
	23e342681c007       38af8ddebf499       7 minutes ago       Running             kube-vip                  1                   87b3e236006c5       kube-vip-ha-949000
	6966a01f96234       604f5db92eaa8       7 minutes ago       Exited              kube-apiserver            2                   e0447c649afe4       kube-apiserver-ha-949000
	f5deb862745e4       8c811b4aec35f       13 minutes ago      Exited              busybox                   1                   88b8aff8a006d       busybox-7dff88458-5kkbw
	f89b862064139       ad83b2ca7b09e       13 minutes ago      Exited              kube-proxy                1                   eb9132907eda4       kube-proxy-q7ndn
	ac487ac32c364       cbb01a7bd410d       13 minutes ago      Exited              coredns                   1                   b2a8128cbfc29       coredns-6f6b679f8f-snq8s
	ff98d7e38a1e6       12968670680f4       13 minutes ago      Exited              kindnet-cni               1                   fc1aa95e54f86       kindnet-jzj42
	c4dc6059b2150       cbb01a7bd410d       13 minutes ago      Exited              coredns                   1                   9b710526ef4f9       coredns-6f6b679f8f-kjszm
	5b0ac6b7faf7d       1766f54c897f0       13 minutes ago      Exited              kube-scheduler            1                   6e330e66cf27f       kube-scheduler-ha-949000
	2255978551ea3       2e96e5913fc06       13 minutes ago      Exited              etcd                      1                   d62930734f2f9       etcd-ha-949000
	0bb147eb5f408       38af8ddebf499       13 minutes ago      Exited              kube-vip                  0                   9ac139ab4844d       kube-vip-ha-949000
	
	
	==> coredns [5b45844943a7] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:47900 - 36884 "HINFO IN 2333551711870933102.2340796284351020766. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.008323198s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1041136774]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.776) (total time: 30000ms):
	Trace[1041136774]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (22:45:09.776)
	Trace[1041136774]: [30.000488845s] [30.000488845s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[2116759242]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.776) (total time: 30000ms):
	Trace[2116759242]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (22:45:09.777)
	Trace[2116759242]: [30.00030351s] [30.00030351s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[693026538]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.777) (total time: 30000ms):
	Trace[693026538]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (22:45:09.777)
	Trace[693026538]: [30.000248071s] [30.000248071s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [92325d0ba5d3] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37396 - 48689 "HINFO IN 9162885205725873992.3311076006694622340. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.008859861s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[180755621]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.768) (total time: 30001ms):
	Trace[180755621]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (22:45:09.769)
	Trace[180755621]: [30.001189401s] [30.001189401s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1144270708]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.770) (total time: 30001ms):
	Trace[1144270708]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (22:45:09.772)
	Trace[1144270708]: [30.001530888s] [30.001530888s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[735366369]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:44:39.772) (total time: 30000ms):
	Trace[735366369]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (22:45:09.773)
	Trace[735366369]: [30.000672378s] [30.000672378s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [ac487ac32c36] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37668 - 17883 "HINFO IN 4931414995021238036.4254872758042696539. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.026863898s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1645472327]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30003ms):
	Trace[1645472327]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30002ms (22:37:45.839)
	Trace[1645472327]: [30.003429832s] [30.003429832s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[2054948566]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.838) (total time: 30003ms):
	Trace[2054948566]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30003ms (22:37:45.841)
	Trace[2054948566]: [30.003549662s] [30.003549662s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[850581595]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.840) (total time: 30001ms):
	Trace[850581595]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (22:37:45.841)
	Trace[850581595]: [30.001289039s] [30.001289039s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [c4dc6059b215] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:55597 - 61955 "HINFO IN 5411809642052316829.545085282119266902. udp 56 false 512" NXDOMAIN qr,rd,ra 131 0.026601414s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1248174265]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30003ms):
	Trace[1248174265]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30002ms (22:37:45.839)
	Trace[1248174265]: [30.003765448s] [30.003765448s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[313955954]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.840) (total time: 30001ms):
	Trace[313955954]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (22:37:45.841)
	Trace[313955954]: [30.001623019s] [30.001623019s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1099528094]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Aug-2024 22:37:15.837) (total time: 30004ms):
	Trace[1099528094]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30004ms (22:37:45.842)
	Trace[1099528094]: [30.004679878s] [30.004679878s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	Name:               ha-949000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_08_31T15_29_45_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:29:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:50:27 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:49:16 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:49:16 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:49:16 +0000   Sat, 31 Aug 2024 22:29:40 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:49:16 +0000   Sat, 31 Aug 2024 22:37:06 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-949000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 758fb98d149341c7ae245ce9491d8a0f
	  System UUID:                98ca49d1-0000-0000-9e6c-321a4533d56e
	  Boot ID:                    3fc4eb3a-1e97-462c-91b1-b27289849703
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-5kkbw              0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 coredns-6f6b679f8f-kjszm             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     20m
	  kube-system                 coredns-6f6b679f8f-snq8s             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     20m
	  kube-system                 etcd-ha-949000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         20m
	  kube-system                 kindnet-jzj42                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      20m
	  kube-system                 kube-apiserver-ha-949000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         20m
	  kube-system                 kube-controller-manager-ha-949000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         20m
	  kube-system                 kube-proxy-q7ndn                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         20m
	  kube-system                 kube-scheduler-ha-949000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         20m
	  kube-system                 kube-vip-ha-949000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         20m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 20m                    kube-proxy       
	  Normal  Starting                 5m50s                  kube-proxy       
	  Normal  Starting                 13m                    kube-proxy       
	  Normal  NodeHasSufficientMemory  20m                    kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  NodeAllocatableEnforced  20m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 20m                    kubelet          Starting kubelet.
	  Normal  NodeHasNoDiskPressure    20m                    kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     20m                    kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           20m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeReady                20m                    kubelet          Node ha-949000 status is now: NodeReady
	  Normal  RegisteredNode           19m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           18m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           16m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  NodeAllocatableEnforced  14m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientPID     14m (x7 over 14m)      kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  NodeHasNoDiskPressure    14m (x8 over 14m)      kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientMemory  14m (x8 over 14m)      kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  Starting                 14m                    kubelet          Starting kubelet.
	  Normal  RegisteredNode           13m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           13m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           12m                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  Starting                 7m16s                  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  7m16s (x8 over 7m16s)  kubelet          Node ha-949000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    7m16s (x8 over 7m16s)  kubelet          Node ha-949000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     7m16s (x7 over 7m16s)  kubelet          Node ha-949000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  7m16s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           6m21s                  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           6m21s                  node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	  Normal  RegisteredNode           21s                    node-controller  Node ha-949000 event: Registered Node ha-949000 in Controller
	
	
	Name:               ha-949000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_30_43_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:30:41 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:50:25 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:49:14 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:49:14 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:49:14 +0000   Sat, 31 Aug 2024 22:30:41 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:49:14 +0000   Sat, 31 Aug 2024 22:31:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-949000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 65e22cd2b0314498aa33bf9e04730c6a
	  System UUID:                23e54f3d-0000-0000-86b7-b25c818528d1
	  Boot ID:                    1d744b30-5098-4929-bff2-54bd26848d21
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-6r9s5                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 etcd-ha-949000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         19m
	  kube-system                 kindnet-brtj6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      19m
	  kube-system                 kube-apiserver-ha-949000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-controller-manager-ha-949000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-proxy-4r2bt                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-scheduler-ha-949000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-vip-ha-949000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         19m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 6m6s                   kube-proxy       
	  Normal   Starting                 19m                    kube-proxy       
	  Normal   Starting                 16m                    kube-proxy       
	  Normal   Starting                 13m                    kube-proxy       
	  Normal   NodeHasSufficientMemory  19m (x8 over 19m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    19m (x8 over 19m)      kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeAllocatableEnforced  19m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientPID     19m (x7 over 19m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           19m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           19m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           18m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Warning  Rebooted                 16m                    kubelet          Node ha-949000-m02 has been rebooted, boot id: 4ddbe4b0-7ef0-4715-a631-f977c123c463
	  Normal   Starting                 16m                    kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  16m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  16m                    kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    16m                    kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     16m                    kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           16m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   Starting                 13m                    kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  13m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasNoDiskPressure    13m (x8 over 13m)      kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientMemory  13m (x8 over 13m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasSufficientPID     13m (x7 over 13m)      kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           13m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           13m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           12m                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   NodeHasNoDiskPressure    6m33s (x8 over 6m33s)  kubelet          Node ha-949000-m02 status is now: NodeHasNoDiskPressure
	  Normal   Starting                 6m33s                  kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  6m33s (x8 over 6m33s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasSufficientPID     6m33s (x7 over 6m33s)  kubelet          Node ha-949000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  6m33s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           6m21s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           6m21s                  node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	  Normal   RegisteredNode           21s                    node-controller  Node ha-949000-m02 event: Registered Node ha-949000-m02 in Controller
	
	
	Name:               ha-949000-m05
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-949000-m05
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=ha-949000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T15_50_04_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:50:01 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-949000-m05
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 22:50:21 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 22:50:21 +0000   Sat, 31 Aug 2024 22:50:01 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 22:50:21 +0000   Sat, 31 Aug 2024 22:50:01 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 22:50:21 +0000   Sat, 31 Aug 2024 22:50:01 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 22:50:21 +0000   Sat, 31 Aug 2024 22:50:21 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.9
	  Hostname:    ha-949000-m05
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 148a5f1cdeba4c4bbbf298cdd4c0c720
	  System UUID:                52b14bf4-0000-0000-b4ee-182cd122edc6
	  Boot ID:                    88419081-82dc-4beb-973a-497fb1ecc332
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-g8b59                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         8m5s
	  kube-system                 etcd-ha-949000-m05                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         27s
	  kube-system                 kindnet-87plj                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      29s
	  kube-system                 kube-apiserver-ha-949000-m05             250m (12%)    0 (0%)      0 (0%)           0 (0%)         27s
	  kube-system                 kube-controller-manager-ha-949000-m05    200m (10%)    0 (0%)      0 (0%)           0 (0%)         27s
	  kube-system                 kube-proxy-fkqh2                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         29s
	  kube-system                 kube-scheduler-ha-949000-m05             100m (5%)     0 (0%)      0 (0%)           0 (0%)         27s
	  kube-system                 kube-vip-ha-949000-m05                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         24s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 26s                kube-proxy       
	  Normal  NodeHasSufficientMemory  29s (x8 over 29s)  kubelet          Node ha-949000-m05 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    29s (x8 over 29s)  kubelet          Node ha-949000-m05 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     29s (x7 over 29s)  kubelet          Node ha-949000-m05 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  29s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           26s                node-controller  Node ha-949000-m05 event: Registered Node ha-949000-m05 in Controller
	  Normal  RegisteredNode           26s                node-controller  Node ha-949000-m05 event: Registered Node ha-949000-m05 in Controller
	  Normal  RegisteredNode           21s                node-controller  Node ha-949000-m05 event: Registered Node ha-949000-m05 in Controller
	
	
	==> dmesg <==
	[  +0.000001] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.034690] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.008037] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[Aug31 22:43] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000000] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.006696] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.610055] systemd-fstab-generator[126]: Ignoring "noauto" option for root device
	[  +2.275477] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000012] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.654488] systemd-fstab-generator[460]: Ignoring "noauto" option for root device
	[  +0.100018] systemd-fstab-generator[472]: Ignoring "noauto" option for root device
	[  +1.963372] systemd-fstab-generator[1084]: Ignoring "noauto" option for root device
	[  +0.243624] systemd-fstab-generator[1119]: Ignoring "noauto" option for root device
	[  +0.053609] kauditd_printk_skb: 101 callbacks suppressed
	[  +0.048738] systemd-fstab-generator[1131]: Ignoring "noauto" option for root device
	[  +0.109338] systemd-fstab-generator[1145]: Ignoring "noauto" option for root device
	[  +2.485755] systemd-fstab-generator[1361]: Ignoring "noauto" option for root device
	[  +0.105237] systemd-fstab-generator[1373]: Ignoring "noauto" option for root device
	[  +0.097354] systemd-fstab-generator[1385]: Ignoring "noauto" option for root device
	[  +0.120488] systemd-fstab-generator[1400]: Ignoring "noauto" option for root device
	[  +0.432211] systemd-fstab-generator[1563]: Ignoring "noauto" option for root device
	[  +6.838586] kauditd_printk_skb: 212 callbacks suppressed
	[ +21.319655] kauditd_printk_skb: 40 callbacks suppressed
	[Aug31 22:45] kauditd_printk_skb: 78 callbacks suppressed
	
	
	==> etcd [2255978551ea] <==
	{"level":"warn","ts":"2024-08-31T22:42:48.058603Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"5.232147243s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/volumeattachments/\" range_end:\"/registry/volumeattachments0\" count_only:true ","response":"","error":"context canceled"}
	{"level":"info","ts":"2024-08-31T22:42:48.058634Z","caller":"traceutil/trace.go:171","msg":"trace[1372143637] range","detail":"{range_begin:/registry/volumeattachments/; range_end:/registry/volumeattachments0; }","duration":"5.232181152s","start":"2024-08-31T22:42:42.826450Z","end":"2024-08-31T22:42:48.058631Z","steps":["trace[1372143637] 'agreement among raft nodes before linearized reading'  (duration: 5.232147269s)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:42:48.058649Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-08-31T22:42:42.826415Z","time spent":"5.232228764s","remote":"127.0.0.1:48278","response type":"/etcdserverpb.KV/Range","request count":0,"request size":62,"response count":0,"response size":0,"request content":"key:\"/registry/volumeattachments/\" range_end:\"/registry/volumeattachments0\" count_only:true "}
	2024/08/31 22:42:48 WARNING: [core] [Server #7] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-08-31T22:42:48.058725Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"4.593098977s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/apiextensions.k8s.io/customresourcedefinitions/\" range_end:\"/registry/apiextensions.k8s.io/customresourcedefinitions0\" count_only:true ","response":"","error":"context canceled"}
	{"level":"info","ts":"2024-08-31T22:42:48.058739Z","caller":"traceutil/trace.go:171","msg":"trace[371527565] range","detail":"{range_begin:/registry/apiextensions.k8s.io/customresourcedefinitions/; range_end:/registry/apiextensions.k8s.io/customresourcedefinitions0; }","duration":"4.593114862s","start":"2024-08-31T22:42:43.465620Z","end":"2024-08-31T22:42:48.058734Z","steps":["trace[371527565] 'agreement among raft nodes before linearized reading'  (duration: 4.593098849s)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:42:48.058751Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-08-31T22:42:43.465603Z","time spent":"4.593143993s","remote":"127.0.0.1:47898","response type":"/etcdserverpb.KV/Range","request count":0,"request size":120,"response count":0,"response size":0,"request content":"key:\"/registry/apiextensions.k8s.io/customresourcedefinitions/\" range_end:\"/registry/apiextensions.k8s.io/customresourcedefinitions0\" count_only:true "}
	2024/08/31 22:42:48 WARNING: [core] [Server #7] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-08-31T22:42:48.058755Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"1.313645842s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/health\" ","response":"","error":"context canceled"}
	{"level":"info","ts":"2024-08-31T22:42:48.058776Z","caller":"traceutil/trace.go:171","msg":"trace[1159639805] range","detail":"{range_begin:/registry/health; range_end:; }","duration":"1.313669945s","start":"2024-08-31T22:42:46.745100Z","end":"2024-08-31T22:42:48.058770Z","steps":["trace[1159639805] 'agreement among raft nodes before linearized reading'  (duration: 1.313645254s)"],"step_count":1}
	{"level":"warn","ts":"2024-08-31T22:42:48.058793Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-08-31T22:42:46.745083Z","time spent":"1.313705515s","remote":"127.0.0.1:42196","response type":"/etcdserverpb.KV/Range","request count":0,"request size":18,"response count":0,"response size":0,"request content":"key:\"/registry/health\" "}
	2024/08/31 22:42:48 WARNING: [core] [Server #7] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-08-31T22:42:48.096976Z","caller":"embed/serve.go:212","msg":"stopping secure grpc server due to error","error":"accept tcp 192.169.0.5:2379: use of closed network connection"}
	{"level":"warn","ts":"2024-08-31T22:42:48.097045Z","caller":"embed/serve.go:214","msg":"stopped secure grpc server due to error","error":"accept tcp 192.169.0.5:2379: use of closed network connection"}
	{"level":"info","ts":"2024-08-31T22:42:48.097084Z","caller":"etcdserver/server.go:1512","msg":"skipped leadership transfer; local server is not leader","local-member-id":"b8c6c7563d17d844","current-leader-member-id":"0"}
	{"level":"info","ts":"2024-08-31T22:42:48.097205Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097236Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097251Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097335Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097380Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097428Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.097439Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"316786cc150e7430"}
	{"level":"info","ts":"2024-08-31T22:42:48.098722Z","caller":"embed/etcd.go:581","msg":"stopping serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-08-31T22:42:48.098784Z","caller":"embed/etcd.go:586","msg":"stopped serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-08-31T22:42:48.098805Z","caller":"embed/etcd.go:379","msg":"closed etcd server","name":"ha-949000","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.169.0.5:2380"],"advertise-client-urls":["https://192.169.0.5:2379"]}
	
	
	==> etcd [c016f5fcb7d7] <==
	{"level":"info","ts":"2024-08-31T22:50:02.516410Z","caller":"rafthttp/peer.go:137","msg":"started remote peer","remote-peer-id":"3a6582acf2ee6e0d"}
	{"level":"info","ts":"2024-08-31T22:50:02.516643Z","caller":"rafthttp/transport.go:317","msg":"added remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"3a6582acf2ee6e0d","remote-peer-urls":["https://192.169.0.9:2380"]}
	{"level":"info","ts":"2024-08-31T22:50:02.516574Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"3a6582acf2ee6e0d"}
	{"level":"info","ts":"2024-08-31T22:50:02.516562Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"3a6582acf2ee6e0d"}
	{"level":"info","ts":"2024-08-31T22:50:02.516611Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"3a6582acf2ee6e0d"}
	{"level":"info","ts":"2024-08-31T22:50:02.516441Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"3a6582acf2ee6e0d"}
	{"level":"warn","ts":"2024-08-31T22:50:02.550990Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"3a6582acf2ee6e0d","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"warn","ts":"2024-08-31T22:50:03.033932Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"3a6582acf2ee6e0d","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-08-31T22:50:03.604723Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"3a6582acf2ee6e0d"}
	{"level":"info","ts":"2024-08-31T22:50:03.604819Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"3a6582acf2ee6e0d"}
	{"level":"info","ts":"2024-08-31T22:50:03.608105Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"3a6582acf2ee6e0d"}
	{"level":"info","ts":"2024-08-31T22:50:03.630810Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"3a6582acf2ee6e0d","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-08-31T22:50:03.630892Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"3a6582acf2ee6e0d"}
	{"level":"info","ts":"2024-08-31T22:50:03.634764Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"3a6582acf2ee6e0d","stream-type":"stream Message"}
	{"level":"info","ts":"2024-08-31T22:50:03.634899Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"3a6582acf2ee6e0d"}
	{"level":"warn","ts":"2024-08-31T22:50:03.637106Z","caller":"embed/config_logging.go:170","msg":"rejected connection on peer endpoint","remote-addr":"192.169.0.9:37566","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-08-31T22:50:03.651001Z","caller":"embed/config_logging.go:170","msg":"rejected connection on peer endpoint","remote-addr":"192.169.0.9:37616","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-08-31T22:50:03.658007Z","caller":"embed/config_logging.go:170","msg":"rejected connection on peer endpoint","remote-addr":"192.169.0.9:37646","server-name":"","error":"read tcp 192.169.0.5:2380->192.169.0.9:37646: read: connection reset by peer"}
	{"level":"warn","ts":"2024-08-31T22:50:03.658732Z","caller":"embed/config_logging.go:170","msg":"rejected connection on peer endpoint","remote-addr":"192.169.0.9:37630","server-name":"","error":"read tcp 192.169.0.5:2380->192.169.0.9:37630: read: connection reset by peer"}
	{"level":"warn","ts":"2024-08-31T22:50:03.659110Z","caller":"embed/config_logging.go:170","msg":"rejected connection on peer endpoint","remote-addr":"192.169.0.9:37662","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-08-31T22:50:04.012262Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"3a6582acf2ee6e0d","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-08-31T22:50:04.515931Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(3559962241544385584 4207913106169294349 13314548521573537860)"}
	{"level":"info","ts":"2024-08-31T22:50:04.516001Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-08-31T22:50:04.516019Z","caller":"etcdserver/server.go:1996","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"3a6582acf2ee6e0d"}
	{"level":"info","ts":"2024-08-31T22:50:31.009418Z","caller":"traceutil/trace.go:171","msg":"trace[953355298] transaction","detail":"{read_only:false; response_revision:4077; number_of_response:1; }","duration":"108.695228ms","start":"2024-08-31T22:50:30.900705Z","end":"2024-08-31T22:50:31.009400Z","steps":["trace[953355298] 'process raft request'  (duration: 108.246851ms)"],"step_count":1}
	
	
	==> kernel <==
	 22:50:31 up 7 min,  0 users,  load average: 0.34, 0.22, 0.10
	Linux ha-949000 5.10.207 #1 SMP Wed Aug 28 20:54:17 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [39caece4a1a0] <==
	I0831 22:49:50.855920       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:49:50.855960       1 main.go:299] handling current node
	I0831 22:50:00.860784       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:50:00.860826       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:50:00.861082       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:50:00.861114       1 main.go:299] handling current node
	I0831 22:50:10.855197       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:50:10.855250       1 main.go:299] handling current node
	I0831 22:50:10.855263       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:50:10.855268       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:50:10.855601       1 main.go:295] Handling node with IPs: map[192.169.0.9:{}]
	I0831 22:50:10.855634       1 main.go:322] Node ha-949000-m05 has CIDR [10.244.2.0/24] 
	I0831 22:50:10.855709       1 routes.go:62] Adding route {Ifindex: 0 Dst: 10.244.2.0/24 Src: <nil> Gw: 192.169.0.9 Flags: [] Table: 0} 
	I0831 22:50:20.857498       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:50:20.857554       1 main.go:299] handling current node
	I0831 22:50:20.857568       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:50:20.857575       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:50:20.857959       1 main.go:295] Handling node with IPs: map[192.169.0.9:{}]
	I0831 22:50:20.858000       1 main.go:322] Node ha-949000-m05 has CIDR [10.244.2.0/24] 
	I0831 22:50:30.860590       1 main.go:295] Handling node with IPs: map[192.169.0.9:{}]
	I0831 22:50:30.860753       1 main.go:322] Node ha-949000-m05 has CIDR [10.244.2.0/24] 
	I0831 22:50:30.860905       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:50:30.861004       1 main.go:299] handling current node
	I0831 22:50:30.861054       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:50:30.861131       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	
	
	==> kindnet [ff98d7e38a1e] <==
	I0831 22:42:06.419355       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:06.419448       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:06.419540       1 main.go:299] handling current node
	I0831 22:42:06.419587       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:06.419596       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:16.418758       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:42:16.418878       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:16.419144       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:16.419199       1 main.go:299] handling current node
	I0831 22:42:16.419230       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:16.419256       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:26.418790       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:26.418914       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:26.419229       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0831 22:42:26.419399       1 main.go:322] Node ha-949000-m03 has CIDR [10.244.2.0/24] 
	I0831 22:42:26.419700       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:26.419804       1 main.go:299] handling current node
	I0831 22:42:36.424537       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:36.424582       1 main.go:299] handling current node
	I0831 22:42:36.424595       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:36.424600       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	I0831 22:42:46.420454       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0831 22:42:46.420626       1 main.go:299] handling current node
	I0831 22:42:46.420750       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0831 22:42:46.420997       1 main.go:322] Node ha-949000-m02 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [6966a01f9623] <==
	I0831 22:43:21.428428       1 options.go:228] external host was not specified, using 192.169.0.5
	I0831 22:43:21.432400       1 server.go:142] Version: v1.31.0
	I0831 22:43:21.432438       1 server.go:144] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:43:22.144103       1 shared_informer.go:313] Waiting for caches to sync for node_authorizer
	I0831 22:43:22.155916       1 shared_informer.go:313] Waiting for caches to sync for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0831 22:43:22.159940       1 plugins.go:157] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0831 22:43:22.160055       1 plugins.go:160] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0831 22:43:22.162610       1 instance.go:232] Using reconciler: lease
	W0831 22:43:42.140091       1 logging.go:55] [core] [Channel #1 SubChannel #3]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0831 22:43:42.143285       1 logging.go:55] [core] [Channel #2 SubChannel #4]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	F0831 22:43:42.166382       1 instance.go:225] Error creating leases: error creating storage factory: context deadline exceeded
	
	
	==> kube-apiserver [8be9164123bc] <==
	I0831 22:44:06.093114       1 cache.go:39] Caches are synced for LocalAvailability controller
	I0831 22:44:06.093378       1 shared_informer.go:320] Caches are synced for configmaps
	I0831 22:44:06.093412       1 apf_controller.go:382] Running API Priority and Fairness config worker
	I0831 22:44:06.093670       1 apf_controller.go:385] Running API Priority and Fairness periodic rebalancing process
	I0831 22:44:06.100700       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0831 22:44:06.107001       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0831 22:44:06.107526       1 aggregator.go:171] initial CRD sync complete...
	I0831 22:44:06.107618       1 autoregister_controller.go:144] Starting autoregister controller
	I0831 22:44:06.107626       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0831 22:44:06.107667       1 cache.go:39] Caches are synced for autoregister controller
	I0831 22:44:06.117188       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0831 22:44:06.117360       1 policy_source.go:224] refreshing policies
	I0831 22:44:06.117520       1 handler_discovery.go:450] Starting ResourceDiscoveryManager
	I0831 22:44:06.117786       1 shared_informer.go:320] Caches are synced for node_authorizer
	E0831 22:44:06.126468       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I0831 22:44:06.191208       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0831 22:44:06.997070       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	W0831 22:44:07.236923       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	I0831 22:44:07.238029       1 controller.go:615] quota admission added evaluator for: endpoints
	I0831 22:44:07.242198       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	E0831 22:50:02.403203       1 writers.go:122] "Unhandled Error" err="apiserver was unable to write a JSON response: http: Handler timeout" logger="UnhandledError"
	E0831 22:50:02.403296       1 finisher.go:175] "Unhandled Error" err="FinishRequest: post-timeout activity - time-elapsed: 4.546µs, panicked: false, err: context canceled, panic-reason: <nil>" logger="UnhandledError"
	E0831 22:50:02.404789       1 status.go:71] "Unhandled Error" err="apiserver received an error that is not an metav1.Status: &errors.errorString{s:\"http: Handler timeout\"}: http: Handler timeout" logger="UnhandledError"
	E0831 22:50:02.406142       1 writers.go:135] "Unhandled Error" err="apiserver was unable to write a fallback JSON response: http: Handler timeout" logger="UnhandledError"
	E0831 22:50:02.407770       1 timeout.go:140] "Post-timeout activity" logger="UnhandledError" timeElapsed="4.619473ms" method="POST" path="/api/v1/namespaces/kube-system/events" result=null
	
	
	==> kube-controller-manager [981e8e790a39] <==
	I0831 22:43:21.974417       1 serving.go:386] Generated self-signed cert in-memory
	I0831 22:43:22.496926       1 controllermanager.go:197] "Starting" version="v1.31.0"
	I0831 22:43:22.497066       1 controllermanager.go:199] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:43:22.499991       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0831 22:43:22.500099       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I0831 22:43:22.500173       1 dynamic_cafile_content.go:160] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0831 22:43:22.500184       1 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	E0831 22:43:43.177282       1 controllermanager.go:242] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.169.0.5:8443/healthz\": dial tcp 192.169.0.5:8443: connect: connection refused"
	
	
	==> kube-controller-manager [ca5e9a101fac] <==
	I0831 22:50:01.582672       1 actual_state_of_world.go:540] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-949000-m05\" does not exist"
	I0831 22:50:01.595457       1 range_allocator.go:422] "Set node PodCIDR" logger="node-ipam-controller" node="ha-949000-m05" podCIDRs=["10.244.2.0/24"]
	I0831 22:50:01.595533       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:01.595588       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:01.638118       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:01.712334       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="44.97µs"
	I0831 22:50:02.492784       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:04.594127       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:04.679854       1 node_lifecycle_controller.go:884] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-949000-m05"
	I0831 22:50:04.716598       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:05.105761       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:05.187024       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:05.274752       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:05.340107       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:09.287979       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:09.379791       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:11.801037       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:22.154190       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:22.164916       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:22.169625       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="33.081µs"
	I0831 22:50:22.179598       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="34.577µs"
	I0831 22:50:22.188467       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="34.485µs"
	I0831 22:50:24.232786       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="ha-949000-m05"
	I0831 22:50:24.647787       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="15.071956ms"
	I0831 22:50:24.648076       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="106.201µs"
	
	
	==> kube-proxy [ce00ce382bb0] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:44:39.825017       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:44:39.836111       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:44:39.836175       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:44:39.866373       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:44:39.866419       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:44:39.866438       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:44:39.868916       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:44:39.869454       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:44:39.869482       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:44:39.871479       1 config.go:197] "Starting service config controller"
	I0831 22:44:39.871768       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:44:39.871891       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:44:39.871917       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:44:39.872900       1 config.go:326] "Starting node config controller"
	I0831 22:44:39.872926       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:44:39.972753       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0831 22:44:39.972790       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:44:39.973169       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-proxy [f89b86206413] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:37:16.195275       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:37:16.220357       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	E0831 22:37:16.220590       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:37:16.265026       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:37:16.265177       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:37:16.265305       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:37:16.268348       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:37:16.268734       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:37:16.269061       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:37:16.272514       1 config.go:197] "Starting service config controller"
	I0831 22:37:16.273450       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:37:16.273658       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:37:16.273777       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:37:16.275413       1 config.go:326] "Starting node config controller"
	I0831 22:37:16.277042       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:37:16.374257       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:37:16.375624       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0831 22:37:16.377606       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [5b0ac6b7faf7] <==
	I0831 22:36:35.937574       1 serving.go:386] Generated self-signed cert in-memory
	W0831 22:36:46.491998       1 authentication.go:370] Error looking up in-cluster authentication configuration: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication": net/http: TLS handshake timeout
	W0831 22:36:46.492020       1 authentication.go:371] Continuing without authentication configuration. This may treat all requests as anonymous.
	W0831 22:36:46.492025       1 authentication.go:372] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I0831 22:36:55.901677       1 server.go:167] "Starting Kubernetes Scheduler" version="v1.31.0"
	I0831 22:36:55.901714       1 server.go:169] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:36:55.904943       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0831 22:36:55.905195       1 secure_serving.go:213] Serving securely on 127.0.0.1:10259
	I0831 22:36:55.905729       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I0831 22:36:55.906036       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0831 22:36:56.006746       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0831 22:42:48.053419       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [6c320a1f78ae] <==
	W0831 22:44:06.042112       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User "system:kube-scheduler" cannot list resource "poddisruptionbudgets" in API group "policy" at the cluster scope
	E0831 22:44:06.042218       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User \"system:kube-scheduler\" cannot list resource \"poddisruptionbudgets\" in API group \"policy\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.042375       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0831 22:44:06.042426       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.042804       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0831 22:44:06.042841       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:44:06.061311       1 reflector.go:561] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0831 22:44:06.061431       1 reflector.go:158] "Unhandled Error" err="runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError"
	I0831 22:44:21.686813       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0831 22:50:01.637948       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-fkqh2\": pod kube-proxy-fkqh2 is already assigned to node \"ha-949000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-fkqh2" node="ha-949000-m05"
	E0831 22:50:01.639154       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod 9d5106a5-75a2-48f6-b8e4-544b3b2c18af(kube-system/kube-proxy-fkqh2) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-fkqh2"
	E0831 22:50:01.639334       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-fkqh2\": pod kube-proxy-fkqh2 is already assigned to node \"ha-949000-m05\"" pod="kube-system/kube-proxy-fkqh2"
	I0831 22:50:01.639507       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-fkqh2" node="ha-949000-m05"
	E0831 22:50:01.639077       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-87plj\": pod kindnet-87plj is already assigned to node \"ha-949000-m05\"" plugin="DefaultBinder" pod="kube-system/kindnet-87plj" node="ha-949000-m05"
	E0831 22:50:01.640771       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod fa3de3e5-787f-400d-91fb-fceb2e3f2947(kube-system/kindnet-87plj) wasn't assumed so cannot be forgotten" pod="kube-system/kindnet-87plj"
	E0831 22:50:01.640782       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-87plj\": pod kindnet-87plj is already assigned to node \"ha-949000-m05\"" pod="kube-system/kindnet-87plj"
	I0831 22:50:01.640791       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-87plj" node="ha-949000-m05"
	E0831 22:50:01.665403       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-pzqh5\": pod kindnet-pzqh5 is already assigned to node \"ha-949000-m05\"" plugin="DefaultBinder" pod="kube-system/kindnet-pzqh5" node="ha-949000-m05"
	E0831 22:50:01.665493       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod 83a39572-87a8-455a-828e-d87bf0544d82(kube-system/kindnet-pzqh5) wasn't assumed so cannot be forgotten" pod="kube-system/kindnet-pzqh5"
	E0831 22:50:01.665542       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-pzqh5\": pod kindnet-pzqh5 is already assigned to node \"ha-949000-m05\"" pod="kube-system/kindnet-pzqh5"
	I0831 22:50:01.665621       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-pzqh5" node="ha-949000-m05"
	E0831 22:50:01.671248       1 framework.go:1305] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-64z94\": pod kube-proxy-64z94 is already assigned to node \"ha-949000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-64z94" node="ha-949000-m05"
	E0831 22:50:01.671315       1 schedule_one.go:348] "scheduler cache ForgetPod failed" err="pod 4aae5b59-64ae-4f57-b63e-1bd2fc528fdc(kube-system/kube-proxy-64z94) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-64z94"
	E0831 22:50:01.671360       1 schedule_one.go:1057] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-64z94\": pod kube-proxy-64z94 is already assigned to node \"ha-949000-m05\"" pod="kube-system/kube-proxy-64z94"
	I0831 22:50:01.671377       1 schedule_one.go:1070] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-64z94" node="ha-949000-m05"
	
	
	==> kubelet <==
	Aug 31 22:46:14 ha-949000 kubelet[1570]: E0831 22:46:14.356754    1570 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:46:14 ha-949000 kubelet[1570]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:46:14 ha-949000 kubelet[1570]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:46:14 ha-949000 kubelet[1570]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:46:14 ha-949000 kubelet[1570]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:47:14 ha-949000 kubelet[1570]: E0831 22:47:14.357076    1570 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:47:14 ha-949000 kubelet[1570]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:47:14 ha-949000 kubelet[1570]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:47:14 ha-949000 kubelet[1570]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:47:14 ha-949000 kubelet[1570]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:48:14 ha-949000 kubelet[1570]: E0831 22:48:14.356617    1570 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:48:14 ha-949000 kubelet[1570]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:48:14 ha-949000 kubelet[1570]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:48:14 ha-949000 kubelet[1570]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:48:14 ha-949000 kubelet[1570]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:49:14 ha-949000 kubelet[1570]: E0831 22:49:14.356648    1570 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:49:14 ha-949000 kubelet[1570]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:49:14 ha-949000 kubelet[1570]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:49:14 ha-949000 kubelet[1570]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:49:14 ha-949000 kubelet[1570]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 22:50:14 ha-949000 kubelet[1570]: E0831 22:50:14.357051    1570 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 22:50:14 ha-949000 kubelet[1570]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 22:50:14 ha-949000 kubelet[1570]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 22:50:14 ha-949000 kubelet[1570]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 22:50:14 ha-949000 kubelet[1570]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-949000 -n ha-949000
helpers_test.go:262: (dbg) Run:  kubectl --context ha-949000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:286: <<< TestMultiControlPlane/serial/AddSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:287: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/AddSecondaryNode (79.31s)
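When triaging a post-mortem dump like the one above, it helps to pull out just the error-level klog lines and group them by source location (e.g. the repeated `proxier.go:734` nftables failures versus the one-off scheduler bind conflicts). The helper below is a minimal sketch, not part of the minikube tooling; the `error_signatures` name and the 60-character message truncation are arbitrary choices for illustration:

```python
import re

# klog lines look like: E0831 22:50:01.637948       1 framework.go:1305] "Plugin Failed" ...
# group(1) = severity (I/W/E/F), group(3) = source file:line, group(4) = message.
KLOG = re.compile(r'^([IWEF])(\d{4} \d{2}:\d{2}:\d{2}\.\d+)\s+\d+\s+(\S+?)\]\s+(.*)')

def error_signatures(log_text):
    """Return the set of (source file:line, message head) for E-level klog entries."""
    sigs = set()
    for line in log_text.splitlines():
        m = KLOG.match(line.strip())
        if m and m.group(1) == "E":
            _sev, _ts, src, msg = m.groups()
            sigs.add((src, msg[:60]))
    return sigs

sample = """
I0831 22:44:39.869454       1 server.go:483] "Version info" version="v1.31.0"
E0831 22:44:39.825017       1 proxier.go:734] "Error cleaning up nftables rules" err=<
E0831 22:44:39.836175       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect"
"""

for src, msg in sorted(error_signatures(sample)):
    print(src, "->", msg)
```

Deduplicating by `(file:line, message head)` collapses the thousands of repeated entries in a dump like this into a handful of distinct failure signatures to investigate.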

                                                
                                    
TestMinikubeProfile (86.38s)

                                                
                                                
=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p first-785000 --driver=hyperkit 
E0831 15:54:15.432835    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
minikube_profile_test.go:44: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p first-785000 --driver=hyperkit : exit status 90 (1m20.519760083s)

                                                
                                                
-- stdout --
	* [first-785000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "first-785000" primary control-plane node in "first-785000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=6000MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Aug 31 22:53:25 first-785000 systemd[1]: Starting Docker Application Container Engine...
	Aug 31 22:53:25 first-785000 dockerd[537]: time="2024-08-31T22:53:25.199769128Z" level=info msg="Starting up"
	Aug 31 22:53:25 first-785000 dockerd[537]: time="2024-08-31T22:53:25.200218479Z" level=info msg="containerd not running, starting managed containerd"
	Aug 31 22:53:25 first-785000 dockerd[537]: time="2024-08-31T22:53:25.200785177Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=544
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.217520251Z" level=info msg="starting containerd" revision=472731909fa34bd7bc9c087e4c27943f9835f111 version=v1.7.21
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.232811733Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.232879446Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.232942738Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.232980183Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.233055773Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.233091731Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.233231369Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.233271597Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.233306071Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.233337913Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.233414327Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.233587338Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.235607468Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.235658540Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.235830547Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.235877619Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.235966519Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.236032648Z" level=info msg="metadata content store policy set" policy=shared
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.321291891Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.321415842Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.321465741Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.321501066Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.321533306Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.321681432Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.321953920Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322066355Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322105703Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322137954Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322168747Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322202327Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322231671Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322262303Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322293595Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322334252Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322369246Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322397935Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322435460Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322470033Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322500758Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322531499Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322560989Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322630946Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322673841Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322704557Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322781941Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322827159Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322858121Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322887742Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322917820Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322949496Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.322985639Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323018454Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323048875Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323124227Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323169728Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323202814Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323232651Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323260511Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323289608Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323317909Z" level=info msg="NRI interface is disabled by configuration."
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323516291Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323615334Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323682051Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Aug 31 22:53:25 first-785000 dockerd[544]: time="2024-08-31T22:53:25.323749120Z" level=info msg="containerd successfully booted in 0.106830s"
	Aug 31 22:53:26 first-785000 dockerd[537]: time="2024-08-31T22:53:26.254699866Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Aug 31 22:53:26 first-785000 dockerd[537]: time="2024-08-31T22:53:26.258469860Z" level=info msg="Loading containers: start."
	Aug 31 22:53:26 first-785000 dockerd[537]: time="2024-08-31T22:53:26.340902688Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Aug 31 22:53:26 first-785000 dockerd[537]: time="2024-08-31T22:53:26.433037930Z" level=info msg="Loading containers: done."
	Aug 31 22:53:26 first-785000 dockerd[537]: time="2024-08-31T22:53:26.444053249Z" level=info msg="Docker daemon" commit=3ab5c7d0 containerd-snapshotter=false storage-driver=overlay2 version=27.2.0
	Aug 31 22:53:26 first-785000 dockerd[537]: time="2024-08-31T22:53:26.444202670Z" level=info msg="Daemon has completed initialization"
	Aug 31 22:53:26 first-785000 dockerd[537]: time="2024-08-31T22:53:26.471083548Z" level=info msg="API listen on /var/run/docker.sock"
	Aug 31 22:53:26 first-785000 dockerd[537]: time="2024-08-31T22:53:26.471244741Z" level=info msg="API listen on [::]:2376"
	Aug 31 22:53:26 first-785000 systemd[1]: Started Docker Application Container Engine.
	Aug 31 22:53:27 first-785000 dockerd[537]: time="2024-08-31T22:53:27.425289131Z" level=info msg="Processing signal 'terminated'"
	Aug 31 22:53:27 first-785000 dockerd[537]: time="2024-08-31T22:53:27.426679450Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	Aug 31 22:53:27 first-785000 systemd[1]: Stopping Docker Application Container Engine...
	Aug 31 22:53:27 first-785000 dockerd[537]: time="2024-08-31T22:53:27.427475937Z" level=info msg="Daemon shutdown complete"
	Aug 31 22:53:27 first-785000 dockerd[537]: time="2024-08-31T22:53:27.427585237Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Aug 31 22:53:27 first-785000 dockerd[537]: time="2024-08-31T22:53:27.427625761Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Aug 31 22:53:28 first-785000 systemd[1]: docker.service: Deactivated successfully.
	Aug 31 22:53:28 first-785000 systemd[1]: Stopped Docker Application Container Engine.
	Aug 31 22:53:28 first-785000 systemd[1]: Starting Docker Application Container Engine...
	Aug 31 22:53:28 first-785000 dockerd[878]: time="2024-08-31T22:53:28.467598629Z" level=info msg="Starting up"
	Aug 31 22:54:28 first-785000 dockerd[878]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Aug 31 22:54:28 first-785000 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Aug 31 22:54:28 first-785000 systemd[1]: docker.service: Failed with result 'exit-code'.
	Aug 31 22:54:28 first-785000 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
minikube_profile_test.go:46: test pre-condition failed. args "out/minikube-darwin-amd64 start -p first-785000 --driver=hyperkit ": exit status 90
panic.go:626: *** TestMinikubeProfile FAILED at 2024-08-31 15:54:28.992756 -0700 PDT m=+2966.218479792
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p second-787000 -n second-787000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p second-787000 -n second-787000: exit status 85 (123.069705ms)

                                                
                                                
-- stdout --
	* Profile "second-787000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p second-787000"

                                                
                                                
-- /stdout --
helpers_test.go:240: status error: exit status 85 (may be ok)
helpers_test.go:242: "second-787000" host is not running, skipping log retrieval (state="* Profile \"second-787000\" not found. Run \"minikube profile list\" to view all profiles.\n  To start a cluster, run: \"minikube start -p second-787000\"")
helpers_test.go:176: Cleaning up "second-787000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p second-787000
panic.go:626: *** TestMinikubeProfile FAILED at 2024-08-31 15:54:29.33144 -0700 PDT m=+2966.557160865
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p first-785000 -n first-785000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p first-785000 -n first-785000: exit status 6 (145.238278ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0831 15:54:29.464917    4500 status.go:417] kubeconfig endpoint: get endpoint: "first-785000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:240: status error: exit status 6 (may be ok)
helpers_test.go:242: "first-785000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
helpers_test.go:176: Cleaning up "first-785000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p first-785000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p first-785000: (5.373843048s)
--- FAIL: TestMinikubeProfile (86.38s)

                                                
                                    
TestMountStart/serial/StartWithMountFirst (136.73s)

                                                
                                                
=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-1-037000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p mount-start-1-037000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit : exit status 80 (2m16.645074002s)

                                                
                                                
-- stdout --
	* [mount-start-1-037000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting minikube without Kubernetes in cluster mount-start-1-037000
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "mount-start-1-037000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 3e:b9:39:e8:db:2e
	* Failed to start hyperkit VM. Running "minikube delete -p mount-start-1-037000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for a6:28:c0:7:cc:7b
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for a6:28:c0:7:cc:7b
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
mount_start_test.go:100: failed to start minikube with args: "out/minikube-darwin-amd64 start -p mount-start-1-037000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit " : exit status 80
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p mount-start-1-037000 -n mount-start-1-037000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p mount-start-1-037000 -n mount-start-1-037000: exit status 7 (85.712292ms)

                                                
                                                
-- stdout --
	Error

                                                
                                                
-- /stdout --
** stderr ** 
	E0831 15:56:51.580321    4552 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0831 15:56:51.580340    4552 status.go:249] status error: getting IP: IP address is not set

                                                
                                                
** /stderr **
helpers_test.go:240: status error: exit status 7 (may be ok)
helpers_test.go:242: "mount-start-1-037000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestMountStart/serial/StartWithMountFirst (136.73s)

                                                
                                    
TestMultiNode/serial/RestartKeepsNodes (95.17s)

                                                
                                                
=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-957000
multinode_test.go:321: (dbg) Run:  out/minikube-darwin-amd64 stop -p multinode-957000
multinode_test.go:321: (dbg) Done: out/minikube-darwin-amd64 stop -p multinode-957000: (18.842414291s)
multinode_test.go:326: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-957000 --wait=true -v=8 --alsologtostderr
multinode_test.go:326: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-957000 --wait=true -v=8 --alsologtostderr: exit status 90 (1m16.04624336s)

                                                
                                                
-- stdout --
	* [multinode-957000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting "multinode-957000" primary control-plane node in "multinode-957000" cluster
	* Restarting existing hyperkit VM for "multinode-957000" ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 16:00:44.296063    5210 out.go:345] Setting OutFile to fd 1 ...
	I0831 16:00:44.296254    5210 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:00:44.296260    5210 out.go:358] Setting ErrFile to fd 2...
	I0831 16:00:44.296268    5210 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:00:44.296447    5210 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 16:00:44.297839    5210 out.go:352] Setting JSON to false
	I0831 16:00:44.320039    5210 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":3615,"bootTime":1725141629,"procs":436,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 16:00:44.320131    5210 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 16:00:44.342144    5210 out.go:177] * [multinode-957000] minikube v1.33.1 on Darwin 14.6.1
	I0831 16:00:44.384036    5210 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 16:00:44.384078    5210 notify.go:220] Checking for updates...
	I0831 16:00:44.426655    5210 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:00:44.448052    5210 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 16:00:44.468785    5210 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 16:00:44.490011    5210 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:00:44.510964    5210 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 16:00:44.532684    5210 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:00:44.532846    5210 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 16:00:44.533532    5210 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:00:44.533616    5210 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:00:44.543218    5210 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53047
	I0831 16:00:44.543594    5210 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:00:44.544002    5210 main.go:141] libmachine: Using API Version  1
	I0831 16:00:44.544010    5210 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:00:44.544216    5210 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:00:44.544336    5210 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:00:44.573149    5210 out.go:177] * Using the hyperkit driver based on existing profile
	I0831 16:00:44.614876    5210 start.go:297] selected driver: hyperkit
	I0831 16:00:44.614928    5210 start.go:901] validating driver "hyperkit" against &{Name:multinode-957000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig
:{KubernetesVersion:v1.31.0 ClusterName:multinode-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.13 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true} {Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:f
alse ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker Binary
Mirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:00:44.615168    5210 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 16:00:44.615353    5210 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:00:44.615553    5210 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 16:00:44.625316    5210 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 16:00:44.629355    5210 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:00:44.629387    5210 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 16:00:44.632100    5210 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 16:00:44.632168    5210 cni.go:84] Creating CNI manager for ""
	I0831 16:00:44.632176    5210 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0831 16:00:44.632254    5210 start.go:340] cluster config:
	{Name:multinode-957000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:multinode-957000 Namespace:default APIServ
erHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.13 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true} {Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:
false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePa
th: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:00:44.632377    5210 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:00:44.675081    5210 out.go:177] * Starting "multinode-957000" primary control-plane node in "multinode-957000" cluster
	I0831 16:00:44.697908    5210 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 16:00:44.697977    5210 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 16:00:44.698008    5210 cache.go:56] Caching tarball of preloaded images
	I0831 16:00:44.698219    5210 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 16:00:44.698240    5210 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 16:00:44.698445    5210 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/config.json ...
	I0831 16:00:44.699064    5210 start.go:360] acquireMachinesLock for multinode-957000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:00:44.699180    5210 start.go:364] duration metric: took 96.409µs to acquireMachinesLock for "multinode-957000"
	I0831 16:00:44.699203    5210 start.go:96] Skipping create...Using existing machine configuration
	I0831 16:00:44.699217    5210 fix.go:54] fixHost starting: 
	I0831 16:00:44.699524    5210 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:00:44.699551    5210 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:00:44.708517    5210 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53049
	I0831 16:00:44.708977    5210 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:00:44.709344    5210 main.go:141] libmachine: Using API Version  1
	I0831 16:00:44.709359    5210 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:00:44.709578    5210 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:00:44.709712    5210 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:00:44.709808    5210 main.go:141] libmachine: (multinode-957000) Calling .GetState
	I0831 16:00:44.709891    5210 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:00:44.709964    5210 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 4580
	I0831 16:00:44.710869    5210 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid 4580 missing from process table
	I0831 16:00:44.710924    5210 fix.go:112] recreateIfNeeded on multinode-957000: state=Stopped err=<nil>
	I0831 16:00:44.710943    5210 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	W0831 16:00:44.711030    5210 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 16:00:44.753022    5210 out.go:177] * Restarting existing hyperkit VM for "multinode-957000" ...
	I0831 16:00:44.775994    5210 main.go:141] libmachine: (multinode-957000) Calling .Start
	I0831 16:00:44.776297    5210 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:00:44.776382    5210 main.go:141] libmachine: (multinode-957000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/hyperkit.pid
	I0831 16:00:44.778134    5210 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid 4580 missing from process table
	I0831 16:00:44.778162    5210 main.go:141] libmachine: (multinode-957000) DBG | pid 4580 is in state "Stopped"
	I0831 16:00:44.778179    5210 main.go:141] libmachine: (multinode-957000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/hyperkit.pid...
	I0831 16:00:44.778350    5210 main.go:141] libmachine: (multinode-957000) DBG | Using UUID 0c4be3ea-664e-4524-9ddd-b85a2c6eb027
	I0831 16:00:44.897834    5210 main.go:141] libmachine: (multinode-957000) DBG | Generated MAC 52:11:67:f6:63:f1
	I0831 16:00:44.897863    5210 main.go:141] libmachine: (multinode-957000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000
	I0831 16:00:44.897990    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"0c4be3ea-664e-4524-9ddd-b85a2c6eb027", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003aa9c0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(
nil)}
	I0831 16:00:44.898023    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"0c4be3ea-664e-4524-9ddd-b85a2c6eb027", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003aa9c0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(
nil)}
	I0831 16:00:44.898089    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "0c4be3ea-664e-4524-9ddd-b85a2c6eb027", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/multinode-957000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/bzimage,/Users/jenkins/minikube-integration/18943-957/
.minikube/machines/multinode-957000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000"}
	I0831 16:00:44.898134    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 0c4be3ea-664e-4524-9ddd-b85a2c6eb027 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/multinode-957000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/initrd,earlyprintk=serial
loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000"
	I0831 16:00:44.898150    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:00:44.899690    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 DEBUG: hyperkit: Pid is 5222
	I0831 16:00:44.900173    5210 main.go:141] libmachine: (multinode-957000) DBG | Attempt 0
	I0831 16:00:44.900187    5210 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:00:44.900274    5210 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 5222
	I0831 16:00:44.901868    5210 main.go:141] libmachine: (multinode-957000) DBG | Searching for 52:11:67:f6:63:f1 in /var/db/dhcpd_leases ...
	I0831 16:00:44.901983    5210 main.go:141] libmachine: (multinode-957000) DBG | Found 14 entries in /var/db/dhcpd_leases!
	I0831 16:00:44.902000    5210 main.go:141] libmachine: (multinode-957000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a08a}
	I0831 16:00:44.902036    5210 main.go:141] libmachine: (multinode-957000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f17f}
	I0831 16:00:44.902051    5210 main.go:141] libmachine: (multinode-957000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f143}
	I0831 16:00:44.902060    5210 main.go:141] libmachine: (multinode-957000) DBG | Found match: 52:11:67:f6:63:f1
	I0831 16:00:44.902067    5210 main.go:141] libmachine: (multinode-957000) DBG | IP: 192.169.0.13
	I0831 16:00:44.902124    5210 main.go:141] libmachine: (multinode-957000) Calling .GetConfigRaw
	I0831 16:00:44.902928    5210 main.go:141] libmachine: (multinode-957000) Calling .GetIP
	I0831 16:00:44.903121    5210 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/config.json ...
	I0831 16:00:44.903600    5210 machine.go:93] provisionDockerMachine start ...
	I0831 16:00:44.903613    5210 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:00:44.903792    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:44.903924    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:44.904027    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:44.904153    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:44.904279    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:44.904398    5210 main.go:141] libmachine: Using SSH client type: native
	I0831 16:00:44.904623    5210 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xda3bea0] 0xda3ec00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:00:44.904632    5210 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 16:00:44.907871    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:00:44.965535    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:00:44.966228    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:00:44.966247    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:00:44.966254    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:00:44.966261    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:44 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:00:45.345208    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:45 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:00:45.345222    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:45 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:00:45.459704    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:45 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:00:45.459723    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:45 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:00:45.459737    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:45 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:00:45.459750    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:45 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:00:45.460650    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:45 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:00:45.460664    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:45 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:00:51.049959    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:51 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 16:00:51.049986    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:51 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 16:00:51.050002    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:51 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 16:00:51.074043    5210 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:00:51 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 16:00:55.973121    5210 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 16:00:55.973137    5210 main.go:141] libmachine: (multinode-957000) Calling .GetMachineName
	I0831 16:00:55.973282    5210 buildroot.go:166] provisioning hostname "multinode-957000"
	I0831 16:00:55.973295    5210 main.go:141] libmachine: (multinode-957000) Calling .GetMachineName
	I0831 16:00:55.973401    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:55.973491    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:55.973575    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:55.973669    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:55.973780    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:55.973910    5210 main.go:141] libmachine: Using SSH client type: native
	I0831 16:00:55.974077    5210 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xda3bea0] 0xda3ec00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:00:55.974086    5210 main.go:141] libmachine: About to run SSH command:
	sudo hostname multinode-957000 && echo "multinode-957000" | sudo tee /etc/hostname
	I0831 16:00:56.040841    5210 main.go:141] libmachine: SSH cmd err, output: <nil>: multinode-957000
	
	I0831 16:00:56.040875    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:56.041014    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:56.041138    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:56.041237    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:56.041323    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:56.041466    5210 main.go:141] libmachine: Using SSH client type: native
	I0831 16:00:56.041609    5210 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xda3bea0] 0xda3ec00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:00:56.041620    5210 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\smultinode-957000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 multinode-957000/g' /etc/hosts;
				else 
					echo '127.0.1.1 multinode-957000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 16:00:56.109560    5210 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 16:00:56.109581    5210 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 16:00:56.109598    5210 buildroot.go:174] setting up certificates
	I0831 16:00:56.109603    5210 provision.go:84] configureAuth start
	I0831 16:00:56.109609    5210 main.go:141] libmachine: (multinode-957000) Calling .GetMachineName
	I0831 16:00:56.109751    5210 main.go:141] libmachine: (multinode-957000) Calling .GetIP
	I0831 16:00:56.109869    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:56.109967    5210 provision.go:143] copyHostCerts
	I0831 16:00:56.109998    5210 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 16:00:56.110065    5210 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 16:00:56.110074    5210 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 16:00:56.110213    5210 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 16:00:56.110426    5210 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 16:00:56.110465    5210 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 16:00:56.110470    5210 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 16:00:56.110545    5210 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 16:00:56.110691    5210 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 16:00:56.110728    5210 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 16:00:56.110732    5210 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 16:00:56.110811    5210 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 16:00:56.110957    5210 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.multinode-957000 san=[127.0.0.1 192.169.0.13 localhost minikube multinode-957000]
	I0831 16:00:56.258816    5210 provision.go:177] copyRemoteCerts
	I0831 16:00:56.258871    5210 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 16:00:56.258890    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:56.259016    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:56.259110    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:56.259207    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:56.259308    5210 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:00:56.296978    5210 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 16:00:56.297065    5210 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 16:00:56.315769    5210 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 16:00:56.315829    5210 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 16:00:56.334846    5210 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 16:00:56.334903    5210 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1216 bytes)
	I0831 16:00:56.353915    5210 provision.go:87] duration metric: took 244.297361ms to configureAuth
	I0831 16:00:56.353928    5210 buildroot.go:189] setting minikube options for container-runtime
	I0831 16:00:56.354107    5210 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:00:56.354137    5210 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:00:56.354277    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:56.354375    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:56.354454    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:56.354533    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:56.354608    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:56.354716    5210 main.go:141] libmachine: Using SSH client type: native
	I0831 16:00:56.354839    5210 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xda3bea0] 0xda3ec00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:00:56.354847    5210 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 16:00:56.414497    5210 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 16:00:56.414510    5210 buildroot.go:70] root file system type: tmpfs
	I0831 16:00:56.414578    5210 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 16:00:56.414592    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:56.414719    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:56.414819    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:56.414912    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:56.415012    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:56.415155    5210 main.go:141] libmachine: Using SSH client type: native
	I0831 16:00:56.415303    5210 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xda3bea0] 0xda3ec00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:00:56.415348    5210 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 16:00:56.485292    5210 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 16:00:56.485317    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:56.485456    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:56.485556    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:56.485636    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:56.485724    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:56.485864    5210 main.go:141] libmachine: Using SSH client type: native
	I0831 16:00:56.486015    5210 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xda3bea0] 0xda3ec00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:00:56.486026    5210 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 16:00:58.143455    5210 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 16:00:58.143470    5210 machine.go:96] duration metric: took 13.239782862s to provisionDockerMachine
	I0831 16:00:58.143482    5210 start.go:293] postStartSetup for "multinode-957000" (driver="hyperkit")
	I0831 16:00:58.143499    5210 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 16:00:58.143512    5210 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:00:58.143697    5210 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 16:00:58.143710    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:58.143805    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:58.143889    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:58.143973    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:58.144062    5210 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:00:58.186315    5210 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 16:00:58.189594    5210 command_runner.go:130] > NAME=Buildroot
	I0831 16:00:58.189603    5210 command_runner.go:130] > VERSION=2023.02.9-dirty
	I0831 16:00:58.189607    5210 command_runner.go:130] > ID=buildroot
	I0831 16:00:58.189611    5210 command_runner.go:130] > VERSION_ID=2023.02.9
	I0831 16:00:58.189615    5210 command_runner.go:130] > PRETTY_NAME="Buildroot 2023.02.9"
	I0831 16:00:58.189681    5210 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 16:00:58.189694    5210 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 16:00:58.189795    5210 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 16:00:58.189971    5210 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 16:00:58.189978    5210 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 16:00:58.190181    5210 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 16:00:58.198140    5210 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 16:00:58.232239    5210 start.go:296] duration metric: took 88.748241ms for postStartSetup
	I0831 16:00:58.232265    5210 fix.go:56] duration metric: took 13.532974576s for fixHost
	I0831 16:00:58.232279    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:58.232421    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:58.232515    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:58.232615    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:58.232729    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:58.232869    5210 main.go:141] libmachine: Using SSH client type: native
	I0831 16:00:58.233024    5210 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xda3bea0] 0xda3ec00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:00:58.233031    5210 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 16:00:58.294610    5210 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725145258.402071171
	
	I0831 16:00:58.294621    5210 fix.go:216] guest clock: 1725145258.402071171
	I0831 16:00:58.294626    5210 fix.go:229] Guest: 2024-08-31 16:00:58.402071171 -0700 PDT Remote: 2024-08-31 16:00:58.232268 -0700 PDT m=+13.972183083 (delta=169.803171ms)
	I0831 16:00:58.294647    5210 fix.go:200] guest clock delta is within tolerance: 169.803171ms
	I0831 16:00:58.294652    5210 start.go:83] releasing machines lock for "multinode-957000", held for 13.595384248s
	I0831 16:00:58.294673    5210 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:00:58.294803    5210 main.go:141] libmachine: (multinode-957000) Calling .GetIP
	I0831 16:00:58.294896    5210 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:00:58.295244    5210 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:00:58.295352    5210 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:00:58.295442    5210 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 16:00:58.295472    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:58.295482    5210 ssh_runner.go:195] Run: cat /version.json
	I0831 16:00:58.295492    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:00:58.295560    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:58.295580    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:00:58.295674    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:58.295694    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:00:58.295778    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:58.295780    5210 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:00:58.295862    5210 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:00:58.295898    5210 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:00:58.374351    5210 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I0831 16:00:58.375363    5210 command_runner.go:130] > {"iso_version": "v1.33.1-1724862017-19530", "kicbase_version": "v0.0.44-1724775115-19521", "minikube_version": "v1.33.1", "commit": "0ce952d110f81b7b94ba20c385955675855b59fb"}
	I0831 16:00:58.375570    5210 ssh_runner.go:195] Run: systemctl --version
	I0831 16:00:58.380533    5210 command_runner.go:130] > systemd 252 (252)
	I0831 16:00:58.380556    5210 command_runner.go:130] > -PAM -AUDIT -SELINUX -APPARMOR -IMA -SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL -ELFUTILS -FIDO2 -IDN2 -IDN +IPTC +KMOD -LIBCRYPTSETUP +LIBFDISK -PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 -BZIP2 +LZ4 +XZ +ZLIB -ZSTD -BPF_FRAMEWORK -XKBCOMMON -UTMP -SYSVINIT default-hierarchy=unified
	I0831 16:00:58.380762    5210 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 16:00:58.385115    5210 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W0831 16:00:58.385141    5210 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 16:00:58.385176    5210 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 16:00:58.398488    5210 command_runner.go:139] > /etc/cni/net.d/87-podman-bridge.conflist, 
	I0831 16:00:58.398521    5210 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 16:00:58.398529    5210 start.go:495] detecting cgroup driver to use...
	I0831 16:00:58.398626    5210 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 16:00:58.413486    5210 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I0831 16:00:58.413837    5210 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 16:00:58.422684    5210 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 16:00:58.431630    5210 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 16:00:58.431672    5210 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 16:00:58.440419    5210 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 16:00:58.449144    5210 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 16:00:58.458049    5210 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 16:00:58.466855    5210 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 16:00:58.475598    5210 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 16:00:58.484585    5210 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 16:00:58.493300    5210 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 16:00:58.502206    5210 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 16:00:58.510045    5210 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I0831 16:00:58.510173    5210 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 16:00:58.518098    5210 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:00:58.612986    5210 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 16:00:58.632042    5210 start.go:495] detecting cgroup driver to use...
	I0831 16:00:58.632120    5210 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 16:00:58.652036    5210 command_runner.go:130] > # /usr/lib/systemd/system/docker.service
	I0831 16:00:58.653589    5210 command_runner.go:130] > [Unit]
	I0831 16:00:58.653597    5210 command_runner.go:130] > Description=Docker Application Container Engine
	I0831 16:00:58.653601    5210 command_runner.go:130] > Documentation=https://docs.docker.com
	I0831 16:00:58.653606    5210 command_runner.go:130] > After=network.target  minikube-automount.service docker.socket
	I0831 16:00:58.653634    5210 command_runner.go:130] > Requires= minikube-automount.service docker.socket 
	I0831 16:00:58.653642    5210 command_runner.go:130] > StartLimitBurst=3
	I0831 16:00:58.653658    5210 command_runner.go:130] > StartLimitIntervalSec=60
	I0831 16:00:58.653663    5210 command_runner.go:130] > [Service]
	I0831 16:00:58.653668    5210 command_runner.go:130] > Type=notify
	I0831 16:00:58.653674    5210 command_runner.go:130] > Restart=on-failure
	I0831 16:00:58.653682    5210 command_runner.go:130] > # This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	I0831 16:00:58.653698    5210 command_runner.go:130] > # The base configuration already specifies an 'ExecStart=...' command. The first directive
	I0831 16:00:58.653705    5210 command_runner.go:130] > # here is to clear out that command inherited from the base configuration. Without this,
	I0831 16:00:58.653711    5210 command_runner.go:130] > # the command from the base configuration and the command specified here are treated as
	I0831 16:00:58.653716    5210 command_runner.go:130] > # a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	I0831 16:00:58.653722    5210 command_runner.go:130] > # will catch this invalid input and refuse to start the service with an error like:
	I0831 16:00:58.653727    5210 command_runner.go:130] > #  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	I0831 16:00:58.653738    5210 command_runner.go:130] > # NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	I0831 16:00:58.653752    5210 command_runner.go:130] > # container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	I0831 16:00:58.653758    5210 command_runner.go:130] > ExecStart=
	I0831 16:00:58.653771    5210 command_runner.go:130] > ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	I0831 16:00:58.653776    5210 command_runner.go:130] > ExecReload=/bin/kill -s HUP $MAINPID
	I0831 16:00:58.653783    5210 command_runner.go:130] > # Having non-zero Limit*s causes performance problems due to accounting overhead
	I0831 16:00:58.653789    5210 command_runner.go:130] > # in the kernel. We recommend using cgroups to do container-local accounting.
	I0831 16:00:58.653795    5210 command_runner.go:130] > LimitNOFILE=infinity
	I0831 16:00:58.653800    5210 command_runner.go:130] > LimitNPROC=infinity
	I0831 16:00:58.653819    5210 command_runner.go:130] > LimitCORE=infinity
	I0831 16:00:58.653826    5210 command_runner.go:130] > # Uncomment TasksMax if your systemd version supports it.
	I0831 16:00:58.653831    5210 command_runner.go:130] > # Only systemd 226 and above support this version.
	I0831 16:00:58.653837    5210 command_runner.go:130] > TasksMax=infinity
	I0831 16:00:58.653843    5210 command_runner.go:130] > TimeoutStartSec=0
	I0831 16:00:58.653851    5210 command_runner.go:130] > # set delegate yes so that systemd does not reset the cgroups of docker containers
	I0831 16:00:58.653855    5210 command_runner.go:130] > Delegate=yes
	I0831 16:00:58.653860    5210 command_runner.go:130] > # kill only the docker process, not all processes in the cgroup
	I0831 16:00:58.653864    5210 command_runner.go:130] > KillMode=process
	I0831 16:00:58.653866    5210 command_runner.go:130] > [Install]
	I0831 16:00:58.653878    5210 command_runner.go:130] > WantedBy=multi-user.target
	I0831 16:00:58.653936    5210 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 16:00:58.665150    5210 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 16:00:58.682611    5210 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 16:00:58.693720    5210 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 16:00:58.704916    5210 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 16:00:58.726476    5210 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 16:00:58.736704    5210 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 16:00:58.751223    5210 command_runner.go:130] > runtime-endpoint: unix:///var/run/cri-dockerd.sock
	I0831 16:00:58.751533    5210 ssh_runner.go:195] Run: which cri-dockerd
	I0831 16:00:58.754304    5210 command_runner.go:130] > /usr/bin/cri-dockerd
	I0831 16:00:58.754477    5210 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 16:00:58.761809    5210 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 16:00:58.775390    5210 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 16:00:58.869891    5210 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 16:00:58.976355    5210 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 16:00:58.976430    5210 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 16:00:58.990499    5210 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:00:59.091301    5210 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 16:02:00.118743    5210 command_runner.go:130] ! Job for docker.service failed because the control process exited with error code.
	I0831 16:02:00.118757    5210 command_runner.go:130] ! See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	I0831 16:02:00.118881    5210 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1m1.027206224s)
	I0831 16:02:00.118947    5210 ssh_runner.go:195] Run: sudo journalctl --no-pager -u docker
	I0831 16:02:00.128879    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 systemd[1]: Starting Docker Application Container Engine...
	I0831 16:02:00.128891    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[485]: time="2024-08-31T23:00:56.917434958Z" level=info msg="Starting up"
	I0831 16:02:00.128901    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[485]: time="2024-08-31T23:00:56.918080414Z" level=info msg="containerd not running, starting managed containerd"
	I0831 16:02:00.128911    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[485]: time="2024-08-31T23:00:56.918638925Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=492
	I0831 16:02:00.128919    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.935620179Z" level=info msg="starting containerd" revision=472731909fa34bd7bc9c087e4c27943f9835f111 version=v1.7.21
	I0831 16:02:00.128929    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950797945Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	I0831 16:02:00.128942    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950859059Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	I0831 16:02:00.128951    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950919747Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	I0831 16:02:00.128959    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950955560Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	I0831 16:02:00.128969    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951116292Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	I0831 16:02:00.128979    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951169401Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	I0831 16:02:00.128997    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951298990Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	I0831 16:02:00.129006    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951339586Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	I0831 16:02:00.129017    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951371550Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	I0831 16:02:00.129026    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951477783Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	I0831 16:02:00.129035    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951642833Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	I0831 16:02:00.129045    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951867964Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	I0831 16:02:00.129063    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953383840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	I0831 16:02:00.129072    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953438955Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	I0831 16:02:00.129101    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953570793Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	I0831 16:02:00.129111    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953612629Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	I0831 16:02:00.129120    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953795699Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	I0831 16:02:00.129130    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953853631Z" level=info msg="metadata content store policy set" policy=shared
	I0831 16:02:00.129138    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955849922Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	I0831 16:02:00.129148    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955911861Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	I0831 16:02:00.129156    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955947478Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	I0831 16:02:00.129164    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955979080Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	I0831 16:02:00.129172    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956009649Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	I0831 16:02:00.129180    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956074450Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	I0831 16:02:00.129188    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956230378Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	I0831 16:02:00.129197    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956304823Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	I0831 16:02:00.129206    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956341342Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	I0831 16:02:00.129219    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956380529Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	I0831 16:02:00.129232    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956415349Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	I0831 16:02:00.129243    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956445589Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	I0831 16:02:00.129252    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956481097Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	I0831 16:02:00.129261    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956512132Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	I0831 16:02:00.129270    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956542672Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	I0831 16:02:00.129279    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956572327Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	I0831 16:02:00.129288    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956601375Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	I0831 16:02:00.129359    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956629767Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	I0831 16:02:00.129371    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956712781Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129380    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956757634Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129388    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956789866Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129403    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956820602Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129412    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956849955Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129421    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956879121Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129430    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956907574Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129439    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956937245Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129447    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956969410Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129456    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957000462Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129465    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957029117Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129474    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957057994Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129487    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957087467Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129496    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957118894Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	I0831 16:02:00.129505    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957153165Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129513    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957183791Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129527    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957213323Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	I0831 16:02:00.129536    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957260983Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	I0831 16:02:00.129548    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957295662Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	I0831 16:02:00.129558    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957343487Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	I0831 16:02:00.129568    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957384441Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	I0831 16:02:00.129661    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957417919Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	I0831 16:02:00.129682    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957485048Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	I0831 16:02:00.129690    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957530455Z" level=info msg="NRI interface is disabled by configuration."
	I0831 16:02:00.129698    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957682617Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	I0831 16:02:00.129706    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957748255Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	I0831 16:02:00.129714    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957827995Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	I0831 16:02:00.129726    5210 command_runner.go:130] > Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957865145Z" level=info msg="containerd successfully booted in 0.023194s"
	I0831 16:02:00.129734    5210 command_runner.go:130] > Aug 31 23:00:57 multinode-957000 dockerd[485]: time="2024-08-31T23:00:57.944143184Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	I0831 16:02:00.129748    5210 command_runner.go:130] > Aug 31 23:00:57 multinode-957000 dockerd[485]: time="2024-08-31T23:00:57.968714685Z" level=info msg="Loading containers: start."
	I0831 16:02:00.129768    5210 command_runner.go:130] > Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.111261675Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	I0831 16:02:00.129779    5210 command_runner.go:130] > Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.171654844Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address"
	I0831 16:02:00.129794    5210 command_runner.go:130] > Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.217639585Z" level=warning msg="error locating sandbox id 7222dc654765c0f239f58d212687d3a644a65fb36bf96cb511d1f7535fefeaea: sandbox 7222dc654765c0f239f58d212687d3a644a65fb36bf96cb511d1f7535fefeaea not found"
	I0831 16:02:00.129802    5210 command_runner.go:130] > Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.217899910Z" level=info msg="Loading containers: done."
	I0831 16:02:00.129811    5210 command_runner.go:130] > Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.226601831Z" level=info msg="Docker daemon" commit=3ab5c7d0 containerd-snapshotter=false storage-driver=overlay2 version=27.2.0
	I0831 16:02:00.129819    5210 command_runner.go:130] > Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.226719981Z" level=info msg="Daemon has completed initialization"
	I0831 16:02:00.129826    5210 command_runner.go:130] > Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.248545960Z" level=info msg="API listen on /var/run/docker.sock"
	I0831 16:02:00.129833    5210 command_runner.go:130] > Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.248663159Z" level=info msg="API listen on [::]:2376"
	I0831 16:02:00.129839    5210 command_runner.go:130] > Aug 31 23:00:58 multinode-957000 systemd[1]: Started Docker Application Container Engine.
	I0831 16:02:00.129847    5210 command_runner.go:130] > Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.210981789Z" level=info msg="Processing signal 'terminated'"
	I0831 16:02:00.129856    5210 command_runner.go:130] > Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.211844790Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	I0831 16:02:00.129863    5210 command_runner.go:130] > Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.212195525Z" level=info msg="Daemon shutdown complete"
	I0831 16:02:00.129874    5210 command_runner.go:130] > Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.212291883Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	I0831 16:02:00.129885    5210 command_runner.go:130] > Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.212293750Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	I0831 16:02:00.129923    5210 command_runner.go:130] > Aug 31 23:00:59 multinode-957000 systemd[1]: Stopping Docker Application Container Engine...
	I0831 16:02:00.129930    5210 command_runner.go:130] > Aug 31 23:01:00 multinode-957000 systemd[1]: docker.service: Deactivated successfully.
	I0831 16:02:00.129935    5210 command_runner.go:130] > Aug 31 23:01:00 multinode-957000 systemd[1]: Stopped Docker Application Container Engine.
	I0831 16:02:00.129940    5210 command_runner.go:130] > Aug 31 23:01:00 multinode-957000 systemd[1]: Starting Docker Application Container Engine...
	I0831 16:02:00.129947    5210 command_runner.go:130] > Aug 31 23:01:00 multinode-957000 dockerd[911]: time="2024-08-31T23:01:00.253115667Z" level=info msg="Starting up"
	I0831 16:02:00.129956    5210 command_runner.go:130] > Aug 31 23:02:00 multinode-957000 dockerd[911]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	I0831 16:02:00.129966    5210 command_runner.go:130] > Aug 31 23:02:00 multinode-957000 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	I0831 16:02:00.129972    5210 command_runner.go:130] > Aug 31 23:02:00 multinode-957000 systemd[1]: docker.service: Failed with result 'exit-code'.
	I0831 16:02:00.129979    5210 command_runner.go:130] > Aug 31 23:02:00 multinode-957000 systemd[1]: Failed to start Docker Application Container Engine.
	I0831 16:02:00.154868    5210 out.go:201] 
	W0831 16:02:00.176717    5210 out.go:270] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Aug 31 23:00:56 multinode-957000 systemd[1]: Starting Docker Application Container Engine...
	Aug 31 23:00:56 multinode-957000 dockerd[485]: time="2024-08-31T23:00:56.917434958Z" level=info msg="Starting up"
	Aug 31 23:00:56 multinode-957000 dockerd[485]: time="2024-08-31T23:00:56.918080414Z" level=info msg="containerd not running, starting managed containerd"
	Aug 31 23:00:56 multinode-957000 dockerd[485]: time="2024-08-31T23:00:56.918638925Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=492
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.935620179Z" level=info msg="starting containerd" revision=472731909fa34bd7bc9c087e4c27943f9835f111 version=v1.7.21
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950797945Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950859059Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950919747Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950955560Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951116292Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951169401Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951298990Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951339586Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951371550Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951477783Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951642833Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951867964Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953383840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953438955Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953570793Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953612629Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953795699Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953853631Z" level=info msg="metadata content store policy set" policy=shared
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955849922Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955911861Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955947478Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955979080Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956009649Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956074450Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956230378Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956304823Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956341342Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956380529Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956415349Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956445589Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956481097Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956512132Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956542672Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956572327Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956601375Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956629767Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956712781Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956757634Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956789866Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956820602Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956849955Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956879121Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956907574Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956937245Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956969410Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957000462Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957029117Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957057994Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957087467Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957118894Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957153165Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957183791Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957213323Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957260983Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957295662Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957343487Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957384441Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957417919Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957485048Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957530455Z" level=info msg="NRI interface is disabled by configuration."
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957682617Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957748255Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957827995Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957865145Z" level=info msg="containerd successfully booted in 0.023194s"
	Aug 31 23:00:57 multinode-957000 dockerd[485]: time="2024-08-31T23:00:57.944143184Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Aug 31 23:00:57 multinode-957000 dockerd[485]: time="2024-08-31T23:00:57.968714685Z" level=info msg="Loading containers: start."
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.111261675Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.171654844Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address"
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.217639585Z" level=warning msg="error locating sandbox id 7222dc654765c0f239f58d212687d3a644a65fb36bf96cb511d1f7535fefeaea: sandbox 7222dc654765c0f239f58d212687d3a644a65fb36bf96cb511d1f7535fefeaea not found"
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.217899910Z" level=info msg="Loading containers: done."
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.226601831Z" level=info msg="Docker daemon" commit=3ab5c7d0 containerd-snapshotter=false storage-driver=overlay2 version=27.2.0
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.226719981Z" level=info msg="Daemon has completed initialization"
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.248545960Z" level=info msg="API listen on /var/run/docker.sock"
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.248663159Z" level=info msg="API listen on [::]:2376"
	Aug 31 23:00:58 multinode-957000 systemd[1]: Started Docker Application Container Engine.
	Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.210981789Z" level=info msg="Processing signal 'terminated'"
	Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.211844790Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.212195525Z" level=info msg="Daemon shutdown complete"
	Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.212291883Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.212293750Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Aug 31 23:00:59 multinode-957000 systemd[1]: Stopping Docker Application Container Engine...
	Aug 31 23:01:00 multinode-957000 systemd[1]: docker.service: Deactivated successfully.
	Aug 31 23:01:00 multinode-957000 systemd[1]: Stopped Docker Application Container Engine.
	Aug 31 23:01:00 multinode-957000 systemd[1]: Starting Docker Application Container Engine...
	Aug 31 23:01:00 multinode-957000 dockerd[911]: time="2024-08-31T23:01:00.253115667Z" level=info msg="Starting up"
	Aug 31 23:02:00 multinode-957000 dockerd[911]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Aug 31 23:02:00 multinode-957000 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Aug 31 23:02:00 multinode-957000 systemd[1]: docker.service: Failed with result 'exit-code'.
	Aug 31 23:02:00 multinode-957000 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Aug 31 23:00:56 multinode-957000 systemd[1]: Starting Docker Application Container Engine...
	Aug 31 23:00:56 multinode-957000 dockerd[485]: time="2024-08-31T23:00:56.917434958Z" level=info msg="Starting up"
	Aug 31 23:00:56 multinode-957000 dockerd[485]: time="2024-08-31T23:00:56.918080414Z" level=info msg="containerd not running, starting managed containerd"
	Aug 31 23:00:56 multinode-957000 dockerd[485]: time="2024-08-31T23:00:56.918638925Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=492
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.935620179Z" level=info msg="starting containerd" revision=472731909fa34bd7bc9c087e4c27943f9835f111 version=v1.7.21
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950797945Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950859059Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950919747Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.950955560Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951116292Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951169401Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951298990Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951339586Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951371550Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951477783Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951642833Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.951867964Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953383840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953438955Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953570793Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953612629Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953795699Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.953853631Z" level=info msg="metadata content store policy set" policy=shared
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955849922Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955911861Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955947478Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.955979080Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956009649Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956074450Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956230378Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956304823Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956341342Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956380529Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956415349Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956445589Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956481097Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956512132Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956542672Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956572327Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956601375Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956629767Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956712781Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956757634Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956789866Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956820602Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956849955Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956879121Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956907574Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956937245Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.956969410Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957000462Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957029117Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957057994Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957087467Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957118894Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957153165Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957183791Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957213323Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957260983Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957295662Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957343487Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957384441Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957417919Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957485048Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957530455Z" level=info msg="NRI interface is disabled by configuration."
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957682617Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957748255Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957827995Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Aug 31 23:00:56 multinode-957000 dockerd[492]: time="2024-08-31T23:00:56.957865145Z" level=info msg="containerd successfully booted in 0.023194s"
	Aug 31 23:00:57 multinode-957000 dockerd[485]: time="2024-08-31T23:00:57.944143184Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Aug 31 23:00:57 multinode-957000 dockerd[485]: time="2024-08-31T23:00:57.968714685Z" level=info msg="Loading containers: start."
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.111261675Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.171654844Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address"
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.217639585Z" level=warning msg="error locating sandbox id 7222dc654765c0f239f58d212687d3a644a65fb36bf96cb511d1f7535fefeaea: sandbox 7222dc654765c0f239f58d212687d3a644a65fb36bf96cb511d1f7535fefeaea not found"
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.217899910Z" level=info msg="Loading containers: done."
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.226601831Z" level=info msg="Docker daemon" commit=3ab5c7d0 containerd-snapshotter=false storage-driver=overlay2 version=27.2.0
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.226719981Z" level=info msg="Daemon has completed initialization"
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.248545960Z" level=info msg="API listen on /var/run/docker.sock"
	Aug 31 23:00:58 multinode-957000 dockerd[485]: time="2024-08-31T23:00:58.248663159Z" level=info msg="API listen on [::]:2376"
	Aug 31 23:00:58 multinode-957000 systemd[1]: Started Docker Application Container Engine.
	Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.210981789Z" level=info msg="Processing signal 'terminated'"
	Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.211844790Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.212195525Z" level=info msg="Daemon shutdown complete"
	Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.212291883Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Aug 31 23:00:59 multinode-957000 dockerd[485]: time="2024-08-31T23:00:59.212293750Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Aug 31 23:00:59 multinode-957000 systemd[1]: Stopping Docker Application Container Engine...
	Aug 31 23:01:00 multinode-957000 systemd[1]: docker.service: Deactivated successfully.
	Aug 31 23:01:00 multinode-957000 systemd[1]: Stopped Docker Application Container Engine.
	Aug 31 23:01:00 multinode-957000 systemd[1]: Starting Docker Application Container Engine...
	Aug 31 23:01:00 multinode-957000 dockerd[911]: time="2024-08-31T23:01:00.253115667Z" level=info msg="Starting up"
	Aug 31 23:02:00 multinode-957000 dockerd[911]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Aug 31 23:02:00 multinode-957000 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Aug 31 23:02:00 multinode-957000 systemd[1]: docker.service: Failed with result 'exit-code'.
	Aug 31 23:02:00 multinode-957000 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	W0831 16:02:00.176787    5210 out.go:270] * 
	* 
	W0831 16:02:00.178052    5210 out.go:293] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0831 16:02:00.242519    5210 out.go:201] 

                                                
                                                
** /stderr **
multinode_test.go:328: failed to run minikube start. args "out/minikube-darwin-amd64 node list -p multinode-957000" : exit status 90
multinode_test.go:331: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-957000
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p multinode-957000 -n multinode-957000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p multinode-957000 -n multinode-957000: exit status 6 (145.197607ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0831 16:02:00.497880    5244 status.go:417] kubeconfig endpoint: get endpoint: "multinode-957000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:240: status error: exit status 6 (may be ok)
helpers_test.go:242: "multinode-957000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestMultiNode/serial/RestartKeepsNodes (95.17s)

                                                
                                    
TestMultiNode/serial/DeleteNode (0.5s)

                                                
                                                
=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 node delete m03
multinode_test.go:416: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-957000 node delete m03: exit status 103 (184.587848ms)

                                                
                                                
-- stdout --
	* The control-plane node multinode-957000 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p multinode-957000"

                                                
                                                
-- /stdout --
multinode_test.go:418: node delete returned an error. args "out/minikube-darwin-amd64 -p multinode-957000 node delete m03": exit status 103
multinode_test.go:422: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr
multinode_test.go:422: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr: exit status 7 (170.242801ms)

                                                
                                                
-- stdout --
	multinode-957000
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Misconfigured
	
	
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`
	multinode-957000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
	multinode-957000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 16:02:00.749370    5253 out.go:345] Setting OutFile to fd 1 ...
	I0831 16:02:00.749544    5253 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:02:00.749549    5253 out.go:358] Setting ErrFile to fd 2...
	I0831 16:02:00.749553    5253 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:02:00.749723    5253 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 16:02:00.749906    5253 out.go:352] Setting JSON to false
	I0831 16:02:00.749929    5253 mustload.go:65] Loading cluster: multinode-957000
	I0831 16:02:00.749965    5253 notify.go:220] Checking for updates...
	I0831 16:02:00.750232    5253 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:02:00.750250    5253 status.go:255] checking status of multinode-957000 ...
	I0831 16:02:00.750594    5253 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:02:00.750637    5253 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:02:00.759437    5253 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53088
	I0831 16:02:00.759827    5253 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:02:00.760243    5253 main.go:141] libmachine: Using API Version  1
	I0831 16:02:00.760252    5253 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:02:00.760481    5253 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:02:00.760633    5253 main.go:141] libmachine: (multinode-957000) Calling .GetState
	I0831 16:02:00.760738    5253 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:02:00.760792    5253 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 5222
	I0831 16:02:00.761749    5253 status.go:330] multinode-957000 host status = "Running" (err=<nil>)
	I0831 16:02:00.761772    5253 host.go:66] Checking if "multinode-957000" exists ...
	I0831 16:02:00.762021    5253 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:02:00.762045    5253 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:02:00.770442    5253 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53090
	I0831 16:02:00.770796    5253 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:02:00.771127    5253 main.go:141] libmachine: Using API Version  1
	I0831 16:02:00.771135    5253 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:02:00.771349    5253 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:02:00.771464    5253 main.go:141] libmachine: (multinode-957000) Calling .GetIP
	I0831 16:02:00.771550    5253 host.go:66] Checking if "multinode-957000" exists ...
	I0831 16:02:00.771805    5253 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:02:00.771833    5253 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:02:00.780141    5253 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53092
	I0831 16:02:00.780450    5253 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:02:00.780792    5253 main.go:141] libmachine: Using API Version  1
	I0831 16:02:00.780832    5253 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:02:00.781030    5253 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:02:00.781128    5253 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:02:00.781277    5253 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 16:02:00.781297    5253 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:02:00.781380    5253 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:02:00.781464    5253 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:02:00.781556    5253 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:02:00.781639    5253 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:02:00.815011    5253 ssh_runner.go:195] Run: systemctl --version
	I0831 16:02:00.819187    5253 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	E0831 16:02:00.830658    5253 status.go:417] kubeconfig endpoint: get endpoint: "multinode-957000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:02:00.830680    5253 api_server.go:166] Checking apiserver status ...
	I0831 16:02:00.830718    5253 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0831 16:02:00.841045    5253 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0831 16:02:00.841055    5253 status.go:422] multinode-957000 apiserver status = Stopped (err=<nil>)
	I0831 16:02:00.841065    5253 status.go:257] multinode-957000 status: &{Name:multinode-957000 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Misconfigured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 16:02:00.841076    5253 status.go:255] checking status of multinode-957000-m02 ...
	I0831 16:02:00.841343    5253 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:02:00.841363    5253 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:02:00.850019    5253 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53095
	I0831 16:02:00.850355    5253 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:02:00.850679    5253 main.go:141] libmachine: Using API Version  1
	I0831 16:02:00.850689    5253 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:02:00.850912    5253 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:02:00.851030    5253 main.go:141] libmachine: (multinode-957000-m02) Calling .GetState
	I0831 16:02:00.851124    5253 main.go:141] libmachine: (multinode-957000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:02:00.851211    5253 main.go:141] libmachine: (multinode-957000-m02) DBG | hyperkit pid from json: 4597
	I0831 16:02:00.852128    5253 main.go:141] libmachine: (multinode-957000-m02) DBG | hyperkit pid 4597 missing from process table
	I0831 16:02:00.852171    5253 status.go:330] multinode-957000-m02 host status = "Stopped" (err=<nil>)
	I0831 16:02:00.852180    5253 status.go:343] host is not running, skipping remaining checks
	I0831 16:02:00.852187    5253 status.go:257] multinode-957000-m02 status: &{Name:multinode-957000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0831 16:02:00.852203    5253 status.go:255] checking status of multinode-957000-m03 ...
	I0831 16:02:00.852467    5253 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:02:00.852490    5253 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:02:00.860827    5253 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53097
	I0831 16:02:00.861164    5253 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:02:00.861503    5253 main.go:141] libmachine: Using API Version  1
	I0831 16:02:00.861516    5253 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:02:00.861779    5253 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:02:00.861939    5253 main.go:141] libmachine: (multinode-957000-m03) Calling .GetState
	I0831 16:02:00.862035    5253 main.go:141] libmachine: (multinode-957000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:02:00.862116    5253 main.go:141] libmachine: (multinode-957000-m03) DBG | hyperkit pid from json: 4887
	I0831 16:02:00.863028    5253 main.go:141] libmachine: (multinode-957000-m03) DBG | hyperkit pid 4887 missing from process table
	I0831 16:02:00.863048    5253 status.go:330] multinode-957000-m03 host status = "Stopped" (err=<nil>)
	I0831 16:02:00.863054    5253 status.go:343] host is not running, skipping remaining checks
	I0831 16:02:00.863060    5253 status.go:257] multinode-957000-m03 status: &{Name:multinode-957000-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
multinode_test.go:424: failed to run minikube status. args "out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr" : exit status 7
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p multinode-957000 -n multinode-957000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p multinode-957000 -n multinode-957000: exit status 6 (145.976057ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0831 16:02:00.999836    5260 status.go:417] kubeconfig endpoint: get endpoint: "multinode-957000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:240: status error: exit status 6 (may be ok)
helpers_test.go:242: "multinode-957000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestMultiNode/serial/DeleteNode (0.50s)

                                                
                                    
TestMultiNode/serial/StopMultiNode (158.77s)

                                                
                                                
=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 stop
E0831 16:02:52.712924    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:04:15.436004    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
multinode_test.go:345: (dbg) Done: out/minikube-darwin-amd64 -p multinode-957000 stop: (2m38.51749071s)
multinode_test.go:351: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-957000 status: exit status 7 (90.716376ms)

                                                
                                                
-- stdout --
	multinode-957000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-957000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
	multinode-957000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr: exit status 7 (89.842376ms)

                                                
                                                
-- stdout --
	multinode-957000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-957000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
	multinode-957000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 16:04:39.675320    5334 out.go:345] Setting OutFile to fd 1 ...
	I0831 16:04:39.675600    5334 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:04:39.675605    5334 out.go:358] Setting ErrFile to fd 2...
	I0831 16:04:39.675608    5334 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:04:39.675780    5334 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 16:04:39.675953    5334 out.go:352] Setting JSON to false
	I0831 16:04:39.675978    5334 mustload.go:65] Loading cluster: multinode-957000
	I0831 16:04:39.676015    5334 notify.go:220] Checking for updates...
	I0831 16:04:39.676271    5334 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:04:39.676287    5334 status.go:255] checking status of multinode-957000 ...
	I0831 16:04:39.676641    5334 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:04:39.676684    5334 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:04:39.685218    5334 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53124
	I0831 16:04:39.685654    5334 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:04:39.686061    5334 main.go:141] libmachine: Using API Version  1
	I0831 16:04:39.686073    5334 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:04:39.686284    5334 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:04:39.686408    5334 main.go:141] libmachine: (multinode-957000) Calling .GetState
	I0831 16:04:39.686496    5334 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:04:39.686568    5334 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 5222
	I0831 16:04:39.687464    5334 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid 5222 missing from process table
	I0831 16:04:39.687508    5334 status.go:330] multinode-957000 host status = "Stopped" (err=<nil>)
	I0831 16:04:39.687514    5334 status.go:343] host is not running, skipping remaining checks
	I0831 16:04:39.687521    5334 status.go:257] multinode-957000 status: &{Name:multinode-957000 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 16:04:39.687540    5334 status.go:255] checking status of multinode-957000-m02 ...
	I0831 16:04:39.687792    5334 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:04:39.687812    5334 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:04:39.696186    5334 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53126
	I0831 16:04:39.696529    5334 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:04:39.696898    5334 main.go:141] libmachine: Using API Version  1
	I0831 16:04:39.696928    5334 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:04:39.697150    5334 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:04:39.697269    5334 main.go:141] libmachine: (multinode-957000-m02) Calling .GetState
	I0831 16:04:39.697360    5334 main.go:141] libmachine: (multinode-957000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:04:39.697422    5334 main.go:141] libmachine: (multinode-957000-m02) DBG | hyperkit pid from json: 4597
	I0831 16:04:39.698314    5334 main.go:141] libmachine: (multinode-957000-m02) DBG | hyperkit pid 4597 missing from process table
	I0831 16:04:39.698353    5334 status.go:330] multinode-957000-m02 host status = "Stopped" (err=<nil>)
	I0831 16:04:39.698361    5334 status.go:343] host is not running, skipping remaining checks
	I0831 16:04:39.698373    5334 status.go:257] multinode-957000-m02 status: &{Name:multinode-957000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0831 16:04:39.698384    5334 status.go:255] checking status of multinode-957000-m03 ...
	I0831 16:04:39.698630    5334 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:04:39.698658    5334 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:04:39.707011    5334 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53128
	I0831 16:04:39.707320    5334 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:04:39.707668    5334 main.go:141] libmachine: Using API Version  1
	I0831 16:04:39.707686    5334 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:04:39.707900    5334 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:04:39.708010    5334 main.go:141] libmachine: (multinode-957000-m03) Calling .GetState
	I0831 16:04:39.708084    5334 main.go:141] libmachine: (multinode-957000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:04:39.708156    5334 main.go:141] libmachine: (multinode-957000-m03) DBG | hyperkit pid from json: 4887
	I0831 16:04:39.709069    5334 main.go:141] libmachine: (multinode-957000-m03) DBG | hyperkit pid 4887 missing from process table
	I0831 16:04:39.709091    5334 status.go:330] multinode-957000-m03 host status = "Stopped" (err=<nil>)
	I0831 16:04:39.709099    5334 status.go:343] host is not running, skipping remaining checks
	I0831 16:04:39.709106    5334 status.go:257] multinode-957000-m03 status: &{Name:multinode-957000-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
multinode_test.go:364: incorrect number of stopped hosts: args "out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr": multinode-957000
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
multinode-957000-m02
type: Worker
host: Stopped
kubelet: Stopped

                                                
                                                
multinode-957000-m03
type: Worker
host: Stopped
kubelet: Stopped

                                                
                                                
multinode_test.go:368: incorrect number of stopped kubelets: args "out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr": multinode-957000
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
multinode-957000-m02
type: Worker
host: Stopped
kubelet: Stopped

                                                
                                                
multinode-957000-m03
type: Worker
host: Stopped
kubelet: Stopped

                                                
                                                
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p multinode-957000 -n multinode-957000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p multinode-957000 -n multinode-957000: exit status 7 (67.81319ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:240: status error: exit status 7 (may be ok)
helpers_test.go:242: "multinode-957000" host is not running, skipping log retrieval (state="Stopped")
--- FAIL: TestMultiNode/serial/StopMultiNode (158.77s)

                                                
                                    
TestMultiNode/serial/RestartMultiNode (189.56s)

                                                
                                                
=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-957000 --wait=true -v=8 --alsologtostderr --driver=hyperkit 
E0831 16:05:55.791398    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
multinode_test.go:376: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-957000 --wait=true -v=8 --alsologtostderr --driver=hyperkit : (3m5.168055258s)
multinode_test.go:382: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr
multinode_test.go:388: status says both hosts are not running: args "out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr": 
-- stdout --
	multinode-957000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-957000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-957000-m03
	type: Worker
	host: Running
	kubelet: Running
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 16:07:45.004526    5443 out.go:345] Setting OutFile to fd 1 ...
	I0831 16:07:45.005290    5443 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:07:45.005298    5443 out.go:358] Setting ErrFile to fd 2...
	I0831 16:07:45.005304    5443 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:07:45.005930    5443 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 16:07:45.006128    5443 out.go:352] Setting JSON to false
	I0831 16:07:45.006152    5443 mustload.go:65] Loading cluster: multinode-957000
	I0831 16:07:45.006189    5443 notify.go:220] Checking for updates...
	I0831 16:07:45.006471    5443 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:07:45.006486    5443 status.go:255] checking status of multinode-957000 ...
	I0831 16:07:45.006841    5443 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:45.006883    5443 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:45.015933    5443 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53242
	I0831 16:07:45.016245    5443 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:45.016647    5443 main.go:141] libmachine: Using API Version  1
	I0831 16:07:45.016657    5443 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:45.016895    5443 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:45.017003    5443 main.go:141] libmachine: (multinode-957000) Calling .GetState
	I0831 16:07:45.017081    5443 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:07:45.017157    5443 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 5355
	I0831 16:07:45.018134    5443 status.go:330] multinode-957000 host status = "Running" (err=<nil>)
	I0831 16:07:45.018155    5443 host.go:66] Checking if "multinode-957000" exists ...
	I0831 16:07:45.018395    5443 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:45.018417    5443 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:45.026735    5443 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53244
	I0831 16:07:45.027067    5443 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:45.027453    5443 main.go:141] libmachine: Using API Version  1
	I0831 16:07:45.027477    5443 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:45.027693    5443 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:45.027827    5443 main.go:141] libmachine: (multinode-957000) Calling .GetIP
	I0831 16:07:45.027913    5443 host.go:66] Checking if "multinode-957000" exists ...
	I0831 16:07:45.028165    5443 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:45.028191    5443 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:45.036644    5443 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53246
	I0831 16:07:45.036965    5443 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:45.037267    5443 main.go:141] libmachine: Using API Version  1
	I0831 16:07:45.037279    5443 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:45.037494    5443 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:45.037603    5443 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:07:45.037751    5443 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 16:07:45.037770    5443 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:07:45.037842    5443 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:07:45.037911    5443 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:07:45.037992    5443 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:07:45.038074    5443 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:07:45.072184    5443 ssh_runner.go:195] Run: systemctl --version
	I0831 16:07:45.076528    5443 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 16:07:45.087210    5443 kubeconfig.go:125] found "multinode-957000" server: "https://192.169.0.13:8443"
	I0831 16:07:45.087233    5443 api_server.go:166] Checking apiserver status ...
	I0831 16:07:45.087276    5443 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 16:07:45.098032    5443 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1696/cgroup
	W0831 16:07:45.105374    5443 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1696/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 16:07:45.105421    5443 ssh_runner.go:195] Run: ls
	I0831 16:07:45.108528    5443 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 16:07:45.111748    5443 api_server.go:279] https://192.169.0.13:8443/healthz returned 200:
	ok
	I0831 16:07:45.111759    5443 status.go:422] multinode-957000 apiserver status = Running (err=<nil>)
	I0831 16:07:45.111768    5443 status.go:257] multinode-957000 status: &{Name:multinode-957000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 16:07:45.111779    5443 status.go:255] checking status of multinode-957000-m02 ...
	I0831 16:07:45.112047    5443 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:45.112068    5443 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:45.120837    5443 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53250
	I0831 16:07:45.121186    5443 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:45.121528    5443 main.go:141] libmachine: Using API Version  1
	I0831 16:07:45.121543    5443 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:45.121777    5443 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:45.121892    5443 main.go:141] libmachine: (multinode-957000-m02) Calling .GetState
	I0831 16:07:45.121971    5443 main.go:141] libmachine: (multinode-957000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:07:45.122051    5443 main.go:141] libmachine: (multinode-957000-m02) DBG | hyperkit pid from json: 5398
	I0831 16:07:45.123030    5443 status.go:330] multinode-957000-m02 host status = "Running" (err=<nil>)
	I0831 16:07:45.123040    5443 host.go:66] Checking if "multinode-957000-m02" exists ...
	I0831 16:07:45.123280    5443 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:45.123318    5443 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:45.131769    5443 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53252
	I0831 16:07:45.132157    5443 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:45.132524    5443 main.go:141] libmachine: Using API Version  1
	I0831 16:07:45.132544    5443 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:45.132760    5443 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:45.132873    5443 main.go:141] libmachine: (multinode-957000-m02) Calling .GetIP
	I0831 16:07:45.132966    5443 host.go:66] Checking if "multinode-957000-m02" exists ...
	I0831 16:07:45.133237    5443 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:45.133259    5443 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:45.141945    5443 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53254
	I0831 16:07:45.142284    5443 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:45.142618    5443 main.go:141] libmachine: Using API Version  1
	I0831 16:07:45.142629    5443 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:45.142849    5443 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:45.142962    5443 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 16:07:45.143099    5443 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 16:07:45.143111    5443 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:07:45.143199    5443 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:07:45.143284    5443 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:07:45.143369    5443 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:07:45.143438    5443 sshutil.go:53] new ssh client: &{IP:192.169.0.14 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/id_rsa Username:docker}
	I0831 16:07:45.172189    5443 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 16:07:45.182199    5443 status.go:257] multinode-957000-m02 status: &{Name:multinode-957000-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0831 16:07:45.182216    5443 status.go:255] checking status of multinode-957000-m03 ...
	I0831 16:07:45.182480    5443 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:45.182501    5443 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:45.191270    5443 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53257
	I0831 16:07:45.191614    5443 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:45.191966    5443 main.go:141] libmachine: Using API Version  1
	I0831 16:07:45.191981    5443 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:45.192182    5443 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:45.192305    5443 main.go:141] libmachine: (multinode-957000-m03) Calling .GetState
	I0831 16:07:45.192402    5443 main.go:141] libmachine: (multinode-957000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:07:45.192467    5443 main.go:141] libmachine: (multinode-957000-m03) DBG | hyperkit pid from json: 5425
	I0831 16:07:45.193454    5443 status.go:330] multinode-957000-m03 host status = "Running" (err=<nil>)
	I0831 16:07:45.193463    5443 host.go:66] Checking if "multinode-957000-m03" exists ...
	I0831 16:07:45.193707    5443 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:45.193731    5443 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:45.202234    5443 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53259
	I0831 16:07:45.202582    5443 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:45.202909    5443 main.go:141] libmachine: Using API Version  1
	I0831 16:07:45.202926    5443 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:45.203161    5443 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:45.203278    5443 main.go:141] libmachine: (multinode-957000-m03) Calling .GetIP
	I0831 16:07:45.203380    5443 host.go:66] Checking if "multinode-957000-m03" exists ...
	I0831 16:07:45.203633    5443 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:45.203655    5443 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:45.212150    5443 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53261
	I0831 16:07:45.212488    5443 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:45.212777    5443 main.go:141] libmachine: Using API Version  1
	I0831 16:07:45.212785    5443 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:45.212988    5443 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:45.213093    5443 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	I0831 16:07:45.213223    5443 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 16:07:45.213234    5443 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:45.213316    5443 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:45.213396    5443 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:45.213480    5443 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:45.213565    5443 sshutil.go:53] new ssh client: &{IP:192.169.0.15 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/id_rsa Username:docker}
	I0831 16:07:45.246030    5443 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 16:07:45.257053    5443 status.go:257] multinode-957000-m03 status: &{Name:multinode-957000-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
multinode_test.go:392: status says both kubelets are not running: args "out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr": 
-- stdout --
	multinode-957000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-957000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-957000-m03
	type: Worker
	host: Running
	kubelet: Running
	

-- /stdout --
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
multinode_test.go:409: expected 2 nodes Ready status to be True, got 
-- stdout --
	' True
	 True
	 True
	'

-- /stdout --
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p multinode-957000 -n multinode-957000
helpers_test.go:245: <<< TestMultiNode/serial/RestartMultiNode FAILED: start of post-mortem logs <<<
helpers_test.go:246: ======>  post-mortem[TestMultiNode/serial/RestartMultiNode]: minikube logs <======
helpers_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 logs -n 25
helpers_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p multinode-957000 logs -n 25: (3.007483315s)
helpers_test.go:253: TestMultiNode/serial/RestartMultiNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|----------------------------------------------------------------------------------------------------------------------------|------------------|---------|---------|---------------------|---------------------|
	| Command |                                                            Args                                                            |     Profile      |  User   | Version |     Start Time      |      End Time       |
	|---------|----------------------------------------------------------------------------------------------------------------------------|------------------|---------|---------|---------------------|---------------------|
	| cp      | multinode-957000 cp multinode-957000-m02:/home/docker/cp-test.txt                                                          | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000:/home/docker/cp-test_multinode-957000-m02_multinode-957000.txt                                            |                  |         |         |                     |                     |
	| ssh     | multinode-957000 ssh -n                                                                                                    | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000-m02 sudo cat                                                                                              |                  |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                                                                   |                  |         |         |                     |                     |
	| ssh     | multinode-957000 ssh -n multinode-957000 sudo cat                                                                          | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | /home/docker/cp-test_multinode-957000-m02_multinode-957000.txt                                                             |                  |         |         |                     |                     |
	| cp      | multinode-957000 cp multinode-957000-m02:/home/docker/cp-test.txt                                                          | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000-m03:/home/docker/cp-test_multinode-957000-m02_multinode-957000-m03.txt                                    |                  |         |         |                     |                     |
	| ssh     | multinode-957000 ssh -n                                                                                                    | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000-m02 sudo cat                                                                                              |                  |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                                                                   |                  |         |         |                     |                     |
	| ssh     | multinode-957000 ssh -n multinode-957000-m03 sudo cat                                                                      | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | /home/docker/cp-test_multinode-957000-m02_multinode-957000-m03.txt                                                         |                  |         |         |                     |                     |
	| cp      | multinode-957000 cp testdata/cp-test.txt                                                                                   | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000-m03:/home/docker/cp-test.txt                                                                              |                  |         |         |                     |                     |
	| ssh     | multinode-957000 ssh -n                                                                                                    | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000-m03 sudo cat                                                                                              |                  |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                                                                   |                  |         |         |                     |                     |
	| cp      | multinode-957000 cp multinode-957000-m03:/home/docker/cp-test.txt                                                          | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile749792849/001/cp-test_multinode-957000-m03.txt |                  |         |         |                     |                     |
	| ssh     | multinode-957000 ssh -n                                                                                                    | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000-m03 sudo cat                                                                                              |                  |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                                                                   |                  |         |         |                     |                     |
	| cp      | multinode-957000 cp multinode-957000-m03:/home/docker/cp-test.txt                                                          | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000:/home/docker/cp-test_multinode-957000-m03_multinode-957000.txt                                            |                  |         |         |                     |                     |
	| ssh     | multinode-957000 ssh -n                                                                                                    | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000-m03 sudo cat                                                                                              |                  |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                                                                   |                  |         |         |                     |                     |
	| ssh     | multinode-957000 ssh -n multinode-957000 sudo cat                                                                          | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | /home/docker/cp-test_multinode-957000-m03_multinode-957000.txt                                                             |                  |         |         |                     |                     |
	| cp      | multinode-957000 cp multinode-957000-m03:/home/docker/cp-test.txt                                                          | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000-m02:/home/docker/cp-test_multinode-957000-m03_multinode-957000-m02.txt                                    |                  |         |         |                     |                     |
	| ssh     | multinode-957000 ssh -n                                                                                                    | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | multinode-957000-m03 sudo cat                                                                                              |                  |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                                                                   |                  |         |         |                     |                     |
	| ssh     | multinode-957000 ssh -n multinode-957000-m02 sudo cat                                                                      | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	|         | /home/docker/cp-test_multinode-957000-m03_multinode-957000-m02.txt                                                         |                  |         |         |                     |                     |
	| node    | multinode-957000 node stop m03                                                                                             | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 15:59 PDT |
	| node    | multinode-957000 node start                                                                                                | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 15:59 PDT | 31 Aug 24 16:00 PDT |
	|         | m03 -v=7 --alsologtostderr                                                                                                 |                  |         |         |                     |                     |
	| node    | list -p multinode-957000                                                                                                   | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 16:00 PDT |                     |
	| stop    | -p multinode-957000                                                                                                        | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 16:00 PDT | 31 Aug 24 16:00 PDT |
	| start   | -p multinode-957000                                                                                                        | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 16:00 PDT |                     |
	|         | --wait=true -v=8                                                                                                           |                  |         |         |                     |                     |
	|         | --alsologtostderr                                                                                                          |                  |         |         |                     |                     |
	| node    | list -p multinode-957000                                                                                                   | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 16:02 PDT |                     |
	| node    | multinode-957000 node delete                                                                                               | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 16:02 PDT |                     |
	|         | m03                                                                                                                        |                  |         |         |                     |                     |
	| stop    | multinode-957000 stop                                                                                                      | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 16:02 PDT | 31 Aug 24 16:04 PDT |
	| start   | -p multinode-957000                                                                                                        | multinode-957000 | jenkins | v1.33.1 | 31 Aug 24 16:04 PDT | 31 Aug 24 16:07 PDT |
	|         | --wait=true -v=8                                                                                                           |                  |         |         |                     |                     |
	|         | --alsologtostderr                                                                                                          |                  |         |         |                     |                     |
	|         | --driver=hyperkit                                                                                                          |                  |         |         |                     |                     |
	|---------|----------------------------------------------------------------------------------------------------------------------------|------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 16:04:39
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 16:04:39.833746    5342 out.go:345] Setting OutFile to fd 1 ...
	I0831 16:04:39.833932    5342 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:04:39.833938    5342 out.go:358] Setting ErrFile to fd 2...
	I0831 16:04:39.833941    5342 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 16:04:39.834116    5342 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 16:04:39.835521    5342 out.go:352] Setting JSON to false
	I0831 16:04:39.857678    5342 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":3850,"bootTime":1725141629,"procs":441,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 16:04:39.857765    5342 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 16:04:39.879973    5342 out.go:177] * [multinode-957000] minikube v1.33.1 on Darwin 14.6.1
	I0831 16:04:39.921897    5342 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 16:04:39.921971    5342 notify.go:220] Checking for updates...
	I0831 16:04:39.965508    5342 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:04:39.987899    5342 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 16:04:40.008723    5342 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 16:04:40.029887    5342 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 16:04:40.050757    5342 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 16:04:40.072387    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:04:40.073087    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:04:40.073171    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:04:40.082947    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53134
	I0831 16:04:40.083485    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:04:40.084088    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:04:40.084097    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:04:40.084434    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:04:40.084567    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:04:40.084772    5342 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 16:04:40.085013    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:04:40.085038    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:04:40.093453    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53136
	I0831 16:04:40.093901    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:04:40.094336    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:04:40.094367    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:04:40.094648    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:04:40.094853    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:04:40.123927    5342 out.go:177] * Using the hyperkit driver based on existing profile
	I0831 16:04:40.165665    5342 start.go:297] selected driver: hyperkit
	I0831 16:04:40.165694    5342 start.go:901] validating driver "hyperkit" against &{Name:multinode-957000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig
:{KubernetesVersion:v1.31.0 ClusterName:multinode-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.13 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true} {Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ing
ress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker
BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:04:40.165953    5342 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 16:04:40.166131    5342 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:04:40.166330    5342 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 16:04:40.175974    5342 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 16:04:40.179775    5342 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:04:40.179800    5342 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 16:04:40.182412    5342 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 16:04:40.182452    5342 cni.go:84] Creating CNI manager for ""
	I0831 16:04:40.182460    5342 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0831 16:04:40.182534    5342 start.go:340] cluster config:
	{Name:multinode-957000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:multinode-957000 Namespace:default APIServ
erHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.13 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true} {Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:
false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePa
th: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:04:40.182645    5342 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 16:04:40.224777    5342 out.go:177] * Starting "multinode-957000" primary control-plane node in "multinode-957000" cluster
	I0831 16:04:40.245720    5342 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 16:04:40.245789    5342 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 16:04:40.245816    5342 cache.go:56] Caching tarball of preloaded images
	I0831 16:04:40.246009    5342 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 16:04:40.246031    5342 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 16:04:40.246216    5342 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/config.json ...
	I0831 16:04:40.247139    5342 start.go:360] acquireMachinesLock for multinode-957000: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:04:40.247294    5342 start.go:364] duration metric: took 126.249µs to acquireMachinesLock for "multinode-957000"
	I0831 16:04:40.247327    5342 start.go:96] Skipping create...Using existing machine configuration
	I0831 16:04:40.247345    5342 fix.go:54] fixHost starting: 
	I0831 16:04:40.247786    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:04:40.247824    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:04:40.256752    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53138
	I0831 16:04:40.257083    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:04:40.257459    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:04:40.257480    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:04:40.257704    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:04:40.257826    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:04:40.257932    5342 main.go:141] libmachine: (multinode-957000) Calling .GetState
	I0831 16:04:40.258016    5342 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:04:40.258082    5342 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 5222
	I0831 16:04:40.259002    5342 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid 5222 missing from process table
	I0831 16:04:40.259026    5342 fix.go:112] recreateIfNeeded on multinode-957000: state=Stopped err=<nil>
	I0831 16:04:40.259042    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	W0831 16:04:40.259121    5342 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 16:04:40.301881    5342 out.go:177] * Restarting existing hyperkit VM for "multinode-957000" ...
	I0831 16:04:40.323699    5342 main.go:141] libmachine: (multinode-957000) Calling .Start
	I0831 16:04:40.323983    5342 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:04:40.324016    5342 main.go:141] libmachine: (multinode-957000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/hyperkit.pid
	I0831 16:04:40.325859    5342 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid 5222 missing from process table
	I0831 16:04:40.325877    5342 main.go:141] libmachine: (multinode-957000) DBG | pid 5222 is in state "Stopped"
	I0831 16:04:40.325894    5342 main.go:141] libmachine: (multinode-957000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/hyperkit.pid...
	I0831 16:04:40.326245    5342 main.go:141] libmachine: (multinode-957000) DBG | Using UUID 0c4be3ea-664e-4524-9ddd-b85a2c6eb027
	I0831 16:04:40.434987    5342 main.go:141] libmachine: (multinode-957000) DBG | Generated MAC 52:11:67:f6:63:f1
	I0831 16:04:40.435014    5342 main.go:141] libmachine: (multinode-957000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000
	I0831 16:04:40.435133    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"0c4be3ea-664e-4524-9ddd-b85a2c6eb027", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bec00)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(
nil)}
	I0831 16:04:40.435169    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"0c4be3ea-664e-4524-9ddd-b85a2c6eb027", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003bec00)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(
nil)}
	I0831 16:04:40.435201    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "0c4be3ea-664e-4524-9ddd-b85a2c6eb027", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/multinode-957000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/bzimage,/Users/jenkins/minikube-integration/18943-957/
.minikube/machines/multinode-957000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000"}
	I0831 16:04:40.435228    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 0c4be3ea-664e-4524-9ddd-b85a2c6eb027 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/multinode-957000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/initrd,earlyprintk=serial
loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000"
	I0831 16:04:40.435280    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:04:40.436831    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 DEBUG: hyperkit: Pid is 5355
	I0831 16:04:40.437272    5342 main.go:141] libmachine: (multinode-957000) DBG | Attempt 0
	I0831 16:04:40.437288    5342 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:04:40.437386    5342 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 5355
	I0831 16:04:40.438911    5342 main.go:141] libmachine: (multinode-957000) DBG | Searching for 52:11:67:f6:63:f1 in /var/db/dhcpd_leases ...
	I0831 16:04:40.438940    5342 main.go:141] libmachine: (multinode-957000) DBG | Found 14 entries in /var/db/dhcpd_leases!
	I0831 16:04:40.438977    5342 main.go:141] libmachine: (multinode-957000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f226}
	I0831 16:04:40.438988    5342 main.go:141] libmachine: (multinode-957000) DBG | Found match: 52:11:67:f6:63:f1
	I0831 16:04:40.438999    5342 main.go:141] libmachine: (multinode-957000) DBG | IP: 192.169.0.13
	I0831 16:04:40.439053    5342 main.go:141] libmachine: (multinode-957000) Calling .GetConfigRaw
	I0831 16:04:40.439804    5342 main.go:141] libmachine: (multinode-957000) Calling .GetIP
	I0831 16:04:40.440049    5342 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/config.json ...
	I0831 16:04:40.440664    5342 machine.go:93] provisionDockerMachine start ...
	I0831 16:04:40.440677    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:04:40.440807    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:04:40.440907    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:04:40.440989    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:04:40.441084    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:04:40.441196    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:04:40.441362    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:04:40.441574    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:04:40.441584    5342 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 16:04:40.444704    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:04:40.496620    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:04:40.497289    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:04:40.497306    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:04:40.497313    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:04:40.497321    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:04:40.878579    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:04:40.878597    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:04:40.993688    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:04:40.993715    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:04:40.993731    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:04:40.993757    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:04:40.994653    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:04:40.994665    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:40 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:04:46.533007    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:46 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 16:04:46.533093    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:46 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 16:04:46.533104    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:46 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 16:04:46.556836    5342 main.go:141] libmachine: (multinode-957000) DBG | 2024/08/31 16:04:46 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 16:05:15.509006    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 16:05:15.509022    5342 main.go:141] libmachine: (multinode-957000) Calling .GetMachineName
	I0831 16:05:15.509178    5342 buildroot.go:166] provisioning hostname "multinode-957000"
	I0831 16:05:15.509188    5342 main.go:141] libmachine: (multinode-957000) Calling .GetMachineName
	I0831 16:05:15.509303    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:15.509403    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:05:15.509503    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.509600    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.509695    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:05:15.509826    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:05:15.509979    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:05:15.509988    5342 main.go:141] libmachine: About to run SSH command:
	sudo hostname multinode-957000 && echo "multinode-957000" | sudo tee /etc/hostname
	I0831 16:05:15.580462    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: multinode-957000
	
	I0831 16:05:15.580481    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:15.580623    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:05:15.580731    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.580815    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.580919    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:05:15.581057    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:05:15.581209    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:05:15.581220    5342 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\smultinode-957000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 multinode-957000/g' /etc/hosts;
				else 
					echo '127.0.1.1 multinode-957000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 16:05:15.647205    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 16:05:15.647232    5342 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 16:05:15.647249    5342 buildroot.go:174] setting up certificates
	I0831 16:05:15.647255    5342 provision.go:84] configureAuth start
	I0831 16:05:15.647262    5342 main.go:141] libmachine: (multinode-957000) Calling .GetMachineName
	I0831 16:05:15.647396    5342 main.go:141] libmachine: (multinode-957000) Calling .GetIP
	I0831 16:05:15.647504    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:15.647603    5342 provision.go:143] copyHostCerts
	I0831 16:05:15.647628    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 16:05:15.647690    5342 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 16:05:15.647699    5342 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 16:05:15.647825    5342 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 16:05:15.648037    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 16:05:15.648078    5342 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 16:05:15.648083    5342 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 16:05:15.648165    5342 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 16:05:15.648304    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 16:05:15.648341    5342 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 16:05:15.648346    5342 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 16:05:15.648421    5342 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 16:05:15.648564    5342 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.multinode-957000 san=[127.0.0.1 192.169.0.13 localhost minikube multinode-957000]
	I0831 16:05:15.702007    5342 provision.go:177] copyRemoteCerts
	I0831 16:05:15.702062    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 16:05:15.702078    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:15.702216    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:05:15.702310    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.702400    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:05:15.702491    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:05:15.740230    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 16:05:15.740308    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 16:05:15.759351    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 16:05:15.759411    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1216 bytes)
	I0831 16:05:15.778422    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 16:05:15.778487    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0831 16:05:15.797237    5342 provision.go:87] duration metric: took 149.967842ms to configureAuth
	I0831 16:05:15.797249    5342 buildroot.go:189] setting minikube options for container-runtime
	I0831 16:05:15.797420    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:05:15.797433    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:05:15.797565    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:15.797670    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:05:15.797736    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.797816    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.797899    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:05:15.798002    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:05:15.798125    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:05:15.798132    5342 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 16:05:15.857809    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 16:05:15.857821    5342 buildroot.go:70] root file system type: tmpfs
	I0831 16:05:15.857903    5342 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 16:05:15.857918    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:15.858046    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:05:15.858138    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.858240    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.858332    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:05:15.858481    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:05:15.858622    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:05:15.858685    5342 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 16:05:15.929719    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 16:05:15.929741    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:15.929869    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:05:15.929956    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.930039    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:15.930140    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:05:15.930285    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:05:15.930431    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:05:15.930443    5342 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 16:05:17.566401    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 16:05:17.566420    5342 machine.go:96] duration metric: took 37.125527471s to provisionDockerMachine
	I0831 16:05:17.566430    5342 start.go:293] postStartSetup for "multinode-957000" (driver="hyperkit")
	I0831 16:05:17.566437    5342 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 16:05:17.566449    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:05:17.566628    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 16:05:17.566641    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:17.566727    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:05:17.566834    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:17.566937    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:05:17.567045    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:05:17.603974    5342 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 16:05:17.607038    5342 command_runner.go:130] > NAME=Buildroot
	I0831 16:05:17.607046    5342 command_runner.go:130] > VERSION=2023.02.9-dirty
	I0831 16:05:17.607049    5342 command_runner.go:130] > ID=buildroot
	I0831 16:05:17.607053    5342 command_runner.go:130] > VERSION_ID=2023.02.9
	I0831 16:05:17.607056    5342 command_runner.go:130] > PRETTY_NAME="Buildroot 2023.02.9"
	I0831 16:05:17.607186    5342 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 16:05:17.607197    5342 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 16:05:17.607293    5342 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 16:05:17.607471    5342 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 16:05:17.607477    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 16:05:17.607675    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 16:05:17.614952    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 16:05:17.634766    5342 start.go:296] duration metric: took 68.327997ms for postStartSetup
	I0831 16:05:17.634790    5342 fix.go:56] duration metric: took 37.387234941s for fixHost
	I0831 16:05:17.634802    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:17.634950    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:05:17.635050    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:17.635133    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:17.635222    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:05:17.635338    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:05:17.635483    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.13 22 <nil> <nil>}
	I0831 16:05:17.635490    5342 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 16:05:17.691503    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725145517.802808551
	
	I0831 16:05:17.691514    5342 fix.go:216] guest clock: 1725145517.802808551
	I0831 16:05:17.691519    5342 fix.go:229] Guest: 2024-08-31 16:05:17.802808551 -0700 PDT Remote: 2024-08-31 16:05:17.634793 -0700 PDT m=+37.836482798 (delta=168.015551ms)
	I0831 16:05:17.691538    5342 fix.go:200] guest clock delta is within tolerance: 168.015551ms
	I0831 16:05:17.691541    5342 start.go:83] releasing machines lock for "multinode-957000", held for 37.444016289s
	I0831 16:05:17.691560    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:05:17.691696    5342 main.go:141] libmachine: (multinode-957000) Calling .GetIP
	I0831 16:05:17.691787    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:05:17.692102    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:05:17.692200    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:05:17.692274    5342 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 16:05:17.692308    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:17.692343    5342 ssh_runner.go:195] Run: cat /version.json
	I0831 16:05:17.692355    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:05:17.692395    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:05:17.692449    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:05:17.692475    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:17.692559    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:05:17.692569    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:05:17.692653    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:05:17.692664    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:05:17.692729    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:05:17.723905    5342 command_runner.go:130] > {"iso_version": "v1.33.1-1724862017-19530", "kicbase_version": "v0.0.44-1724775115-19521", "minikube_version": "v1.33.1", "commit": "0ce952d110f81b7b94ba20c385955675855b59fb"}
	I0831 16:05:17.724140    5342 ssh_runner.go:195] Run: systemctl --version
	I0831 16:05:17.775716    5342 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I0831 16:05:17.776578    5342 command_runner.go:130] > systemd 252 (252)
	I0831 16:05:17.776631    5342 command_runner.go:130] > -PAM -AUDIT -SELINUX -APPARMOR -IMA -SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL -ELFUTILS -FIDO2 -IDN2 -IDN +IPTC +KMOD -LIBCRYPTSETUP +LIBFDISK -PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 -BZIP2 +LZ4 +XZ +ZLIB -ZSTD -BPF_FRAMEWORK -XKBCOMMON -UTMP -SYSVINIT default-hierarchy=unified
	I0831 16:05:17.776748    5342 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 16:05:17.782037    5342 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W0831 16:05:17.782063    5342 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 16:05:17.782098    5342 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 16:05:17.794423    5342 command_runner.go:139] > /etc/cni/net.d/87-podman-bridge.conflist, 
	I0831 16:05:17.794588    5342 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 16:05:17.794598    5342 start.go:495] detecting cgroup driver to use...
	I0831 16:05:17.794717    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 16:05:17.809769    5342 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I0831 16:05:17.810066    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 16:05:17.818339    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 16:05:17.826701    5342 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 16:05:17.826745    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 16:05:17.835102    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 16:05:17.843343    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 16:05:17.851434    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 16:05:17.859733    5342 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 16:05:17.868112    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 16:05:17.876518    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 16:05:17.884732    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 16:05:17.893078    5342 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 16:05:17.900536    5342 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I0831 16:05:17.900668    5342 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 16:05:17.908227    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:05:18.003456    5342 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 16:05:18.019375    5342 start.go:495] detecting cgroup driver to use...
	I0831 16:05:18.019450    5342 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 16:05:18.030836    5342 command_runner.go:130] > # /usr/lib/systemd/system/docker.service
	I0831 16:05:18.031360    5342 command_runner.go:130] > [Unit]
	I0831 16:05:18.031368    5342 command_runner.go:130] > Description=Docker Application Container Engine
	I0831 16:05:18.031373    5342 command_runner.go:130] > Documentation=https://docs.docker.com
	I0831 16:05:18.031378    5342 command_runner.go:130] > After=network.target  minikube-automount.service docker.socket
	I0831 16:05:18.031383    5342 command_runner.go:130] > Requires= minikube-automount.service docker.socket 
	I0831 16:05:18.031389    5342 command_runner.go:130] > StartLimitBurst=3
	I0831 16:05:18.031394    5342 command_runner.go:130] > StartLimitIntervalSec=60
	I0831 16:05:18.031397    5342 command_runner.go:130] > [Service]
	I0831 16:05:18.031400    5342 command_runner.go:130] > Type=notify
	I0831 16:05:18.031403    5342 command_runner.go:130] > Restart=on-failure
	I0831 16:05:18.031417    5342 command_runner.go:130] > # This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	I0831 16:05:18.031425    5342 command_runner.go:130] > # The base configuration already specifies an 'ExecStart=...' command. The first directive
	I0831 16:05:18.031431    5342 command_runner.go:130] > # here is to clear out that command inherited from the base configuration. Without this,
	I0831 16:05:18.031437    5342 command_runner.go:130] > # the command from the base configuration and the command specified here are treated as
	I0831 16:05:18.031443    5342 command_runner.go:130] > # a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	I0831 16:05:18.031448    5342 command_runner.go:130] > # will catch this invalid input and refuse to start the service with an error like:
	I0831 16:05:18.031455    5342 command_runner.go:130] > #  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	I0831 16:05:18.031464    5342 command_runner.go:130] > # NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	I0831 16:05:18.031470    5342 command_runner.go:130] > # container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	I0831 16:05:18.031476    5342 command_runner.go:130] > ExecStart=
	I0831 16:05:18.031489    5342 command_runner.go:130] > ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	I0831 16:05:18.031493    5342 command_runner.go:130] > ExecReload=/bin/kill -s HUP $MAINPID
	I0831 16:05:18.031499    5342 command_runner.go:130] > # Having non-zero Limit*s causes performance problems due to accounting overhead
	I0831 16:05:18.031505    5342 command_runner.go:130] > # in the kernel. We recommend using cgroups to do container-local accounting.
	I0831 16:05:18.031509    5342 command_runner.go:130] > LimitNOFILE=infinity
	I0831 16:05:18.031512    5342 command_runner.go:130] > LimitNPROC=infinity
	I0831 16:05:18.031516    5342 command_runner.go:130] > LimitCORE=infinity
	I0831 16:05:18.031523    5342 command_runner.go:130] > # Uncomment TasksMax if your systemd version supports it.
	I0831 16:05:18.031529    5342 command_runner.go:130] > # Only systemd 226 and above support this version.
	I0831 16:05:18.031535    5342 command_runner.go:130] > TasksMax=infinity
	I0831 16:05:18.031540    5342 command_runner.go:130] > TimeoutStartSec=0
	I0831 16:05:18.031549    5342 command_runner.go:130] > # set delegate yes so that systemd does not reset the cgroups of docker containers
	I0831 16:05:18.031553    5342 command_runner.go:130] > Delegate=yes
	I0831 16:05:18.031557    5342 command_runner.go:130] > # kill only the docker process, not all processes in the cgroup
	I0831 16:05:18.031562    5342 command_runner.go:130] > KillMode=process
	I0831 16:05:18.031565    5342 command_runner.go:130] > [Install]
	I0831 16:05:18.031574    5342 command_runner.go:130] > WantedBy=multi-user.target
	I0831 16:05:18.031711    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 16:05:18.043707    5342 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 16:05:18.057290    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 16:05:18.068802    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 16:05:18.079648    5342 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 16:05:18.101634    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 16:05:18.111809    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 16:05:18.126767    5342 command_runner.go:130] > runtime-endpoint: unix:///var/run/cri-dockerd.sock
	I0831 16:05:18.127021    5342 ssh_runner.go:195] Run: which cri-dockerd
	I0831 16:05:18.130009    5342 command_runner.go:130] > /usr/bin/cri-dockerd
	I0831 16:05:18.130128    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 16:05:18.137312    5342 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 16:05:18.151012    5342 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 16:05:18.263122    5342 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 16:05:18.368346    5342 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 16:05:18.368436    5342 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 16:05:18.382153    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:05:18.474903    5342 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 16:05:20.798359    5342 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.32342324s)
	I0831 16:05:20.798432    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 16:05:20.810171    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 16:05:20.820202    5342 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 16:05:20.916895    5342 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 16:05:21.018307    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:05:21.111394    5342 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 16:05:21.125083    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 16:05:21.135283    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:05:21.237027    5342 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 16:05:21.292953    5342 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 16:05:21.293037    5342 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 16:05:21.297764    5342 command_runner.go:130] >   File: /var/run/cri-dockerd.sock
	I0831 16:05:21.297776    5342 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I0831 16:05:21.297780    5342 command_runner.go:130] > Device: 0,22	Inode: 773         Links: 1
	I0831 16:05:21.297786    5342 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: ( 1000/  docker)
	I0831 16:05:21.297790    5342 command_runner.go:130] > Access: 2024-08-31 23:05:21.362047898 +0000
	I0831 16:05:21.297795    5342 command_runner.go:130] > Modify: 2024-08-31 23:05:21.362047898 +0000
	I0831 16:05:21.297800    5342 command_runner.go:130] > Change: 2024-08-31 23:05:21.363047898 +0000
	I0831 16:05:21.297802    5342 command_runner.go:130] >  Birth: -
	I0831 16:05:21.298008    5342 start.go:563] Will wait 60s for crictl version
	I0831 16:05:21.298055    5342 ssh_runner.go:195] Run: which crictl
	I0831 16:05:21.301336    5342 command_runner.go:130] > /usr/bin/crictl
	I0831 16:05:21.301495    5342 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 16:05:21.326927    5342 command_runner.go:130] > Version:  0.1.0
	I0831 16:05:21.326940    5342 command_runner.go:130] > RuntimeName:  docker
	I0831 16:05:21.326944    5342 command_runner.go:130] > RuntimeVersion:  27.2.0
	I0831 16:05:21.326949    5342 command_runner.go:130] > RuntimeApiVersion:  v1
	I0831 16:05:21.327930    5342 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 16:05:21.328006    5342 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 16:05:21.344898    5342 command_runner.go:130] > 27.2.0
	I0831 16:05:21.345897    5342 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 16:05:21.363592    5342 command_runner.go:130] > 27.2.0
	I0831 16:05:21.408735    5342 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 16:05:21.408789    5342 main.go:141] libmachine: (multinode-957000) Calling .GetIP
	I0831 16:05:21.409218    5342 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 16:05:21.413679    5342 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 16:05:21.423206    5342 kubeadm.go:883] updating cluster {Name:multinode-957000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion
:v1.31.0 ClusterName:multinode-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.13 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true} {Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-
dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: Disab
leOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0831 16:05:21.423288    5342 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 16:05:21.423351    5342 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 16:05:21.436058    5342 command_runner.go:130] > kindest/kindnetd:v20240813-c6f155d6
	I0831 16:05:21.436072    5342 command_runner.go:130] > registry.k8s.io/kube-apiserver:v1.31.0
	I0831 16:05:21.436076    5342 command_runner.go:130] > registry.k8s.io/kube-scheduler:v1.31.0
	I0831 16:05:21.436081    5342 command_runner.go:130] > registry.k8s.io/kube-controller-manager:v1.31.0
	I0831 16:05:21.436085    5342 command_runner.go:130] > registry.k8s.io/kube-proxy:v1.31.0
	I0831 16:05:21.436089    5342 command_runner.go:130] > registry.k8s.io/etcd:3.5.15-0
	I0831 16:05:21.436092    5342 command_runner.go:130] > registry.k8s.io/pause:3.10
	I0831 16:05:21.436107    5342 command_runner.go:130] > registry.k8s.io/coredns/coredns:v1.11.1
	I0831 16:05:21.436112    5342 command_runner.go:130] > gcr.io/k8s-minikube/storage-provisioner:v5
	I0831 16:05:21.436116    5342 command_runner.go:130] > gcr.io/k8s-minikube/busybox:1.28
	I0831 16:05:21.436645    5342 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 16:05:21.436657    5342 docker.go:615] Images already preloaded, skipping extraction
	I0831 16:05:21.436727    5342 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0831 16:05:21.449957    5342 command_runner.go:130] > kindest/kindnetd:v20240813-c6f155d6
	I0831 16:05:21.449971    5342 command_runner.go:130] > registry.k8s.io/kube-controller-manager:v1.31.0
	I0831 16:05:21.449975    5342 command_runner.go:130] > registry.k8s.io/kube-apiserver:v1.31.0
	I0831 16:05:21.449979    5342 command_runner.go:130] > registry.k8s.io/kube-scheduler:v1.31.0
	I0831 16:05:21.449983    5342 command_runner.go:130] > registry.k8s.io/kube-proxy:v1.31.0
	I0831 16:05:21.449986    5342 command_runner.go:130] > registry.k8s.io/etcd:3.5.15-0
	I0831 16:05:21.449990    5342 command_runner.go:130] > registry.k8s.io/pause:3.10
	I0831 16:05:21.450002    5342 command_runner.go:130] > registry.k8s.io/coredns/coredns:v1.11.1
	I0831 16:05:21.450011    5342 command_runner.go:130] > gcr.io/k8s-minikube/storage-provisioner:v5
	I0831 16:05:21.450015    5342 command_runner.go:130] > gcr.io/k8s-minikube/busybox:1.28
	I0831 16:05:21.450526    5342 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240813-c6f155d6
	registry.k8s.io/kube-controller-manager:v1.31.0
	registry.k8s.io/kube-apiserver:v1.31.0
	registry.k8s.io/kube-scheduler:v1.31.0
	registry.k8s.io/kube-proxy:v1.31.0
	registry.k8s.io/etcd:3.5.15-0
	registry.k8s.io/pause:3.10
	registry.k8s.io/coredns/coredns:v1.11.1
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0831 16:05:21.450545    5342 cache_images.go:84] Images are preloaded, skipping loading
	I0831 16:05:21.450555    5342 kubeadm.go:934] updating node { 192.169.0.13 8443 v1.31.0 docker true true} ...
	I0831 16:05:21.450633    5342 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=multinode-957000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.13
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:multinode-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 16:05:21.450707    5342 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0831 16:05:21.486784    5342 command_runner.go:130] > cgroupfs
	I0831 16:05:21.487484    5342 cni.go:84] Creating CNI manager for ""
	I0831 16:05:21.487493    5342 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0831 16:05:21.487503    5342 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0831 16:05:21.487517    5342 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.13 APIServerPort:8443 KubernetesVersion:v1.31.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:multinode-957000 NodeName:multinode-957000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.13"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.13 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc
/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0831 16:05:21.487605    5342 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.13
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "multinode-957000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.13
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.13"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.31.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0831 16:05:21.487672    5342 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 16:05:21.495150    5342 command_runner.go:130] > kubeadm
	I0831 16:05:21.495160    5342 command_runner.go:130] > kubectl
	I0831 16:05:21.495167    5342 command_runner.go:130] > kubelet
	I0831 16:05:21.495285    5342 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 16:05:21.495331    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0831 16:05:21.502443    5342 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0831 16:05:21.515894    5342 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 16:05:21.529010    5342 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2158 bytes)
	I0831 16:05:21.542705    5342 ssh_runner.go:195] Run: grep 192.169.0.13	control-plane.minikube.internal$ /etc/hosts
	I0831 16:05:21.545577    5342 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.13	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 16:05:21.554676    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:05:21.653877    5342 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 16:05:21.669114    5342 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000 for IP: 192.169.0.13
	I0831 16:05:21.669126    5342 certs.go:194] generating shared ca certs ...
	I0831 16:05:21.669136    5342 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 16:05:21.669319    5342 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 16:05:21.669397    5342 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 16:05:21.669408    5342 certs.go:256] generating profile certs ...
	I0831 16:05:21.669525    5342 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.key
	I0831 16:05:21.669603    5342 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/apiserver.key.16050e8c
	I0831 16:05:21.669672    5342 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/proxy-client.key
	I0831 16:05:21.669679    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 16:05:21.669701    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 16:05:21.669719    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 16:05:21.669736    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 16:05:21.669754    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0831 16:05:21.669783    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0831 16:05:21.669812    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0831 16:05:21.669830    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0831 16:05:21.669927    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 16:05:21.669973    5342 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 16:05:21.669981    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 16:05:21.670018    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 16:05:21.670052    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 16:05:21.670090    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 16:05:21.670156    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 16:05:21.670192    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:05:21.670212    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 16:05:21.670234    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 16:05:21.670740    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 16:05:21.706386    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 16:05:21.730408    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 16:05:21.756405    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 16:05:21.780884    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0831 16:05:21.800976    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0831 16:05:21.820593    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0831 16:05:21.840090    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0831 16:05:21.859610    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 16:05:21.879440    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 16:05:21.899510    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 16:05:21.919139    5342 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0831 16:05:21.932506    5342 ssh_runner.go:195] Run: openssl version
	I0831 16:05:21.936509    5342 command_runner.go:130] > OpenSSL 1.1.1w  11 Sep 2023
	I0831 16:05:21.936706    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 16:05:21.944904    5342 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:05:21.948149    5342 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:05:21.948261    5342 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:05:21.948294    5342 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:05:21.952293    5342 command_runner.go:130] > b5213941
	I0831 16:05:21.952526    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 16:05:21.960717    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 16:05:21.968874    5342 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 16:05:21.972058    5342 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 16:05:21.972268    5342 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 16:05:21.972299    5342 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 16:05:21.976246    5342 command_runner.go:130] > 51391683
	I0831 16:05:21.976446    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 16:05:21.984648    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 16:05:21.992823    5342 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 16:05:21.995989    5342 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 16:05:21.996119    5342 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 16:05:21.996154    5342 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 16:05:22.000138    5342 command_runner.go:130] > 3ec20f2e
	I0831 16:05:22.000318    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 16:05:22.008543    5342 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 16:05:22.011716    5342 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 16:05:22.011726    5342 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I0831 16:05:22.011740    5342 command_runner.go:130] > Device: 253,1	Inode: 6289719     Links: 1
	I0831 16:05:22.011749    5342 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I0831 16:05:22.011754    5342 command_runner.go:130] > Access: 2024-08-31 22:57:22.882538015 +0000
	I0831 16:05:22.011759    5342 command_runner.go:130] > Modify: 2024-08-31 22:57:22.882538015 +0000
	I0831 16:05:22.011763    5342 command_runner.go:130] > Change: 2024-08-31 22:57:22.882538015 +0000
	I0831 16:05:22.011768    5342 command_runner.go:130] >  Birth: 2024-08-31 22:57:22.882538015 +0000
	I0831 16:05:22.011843    5342 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0831 16:05:22.016059    5342 command_runner.go:130] > Certificate will not expire
	I0831 16:05:22.016182    5342 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0831 16:05:22.020248    5342 command_runner.go:130] > Certificate will not expire
	I0831 16:05:22.020413    5342 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0831 16:05:22.024611    5342 command_runner.go:130] > Certificate will not expire
	I0831 16:05:22.024659    5342 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0831 16:05:22.028753    5342 command_runner.go:130] > Certificate will not expire
	I0831 16:05:22.028847    5342 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0831 16:05:22.033005    5342 command_runner.go:130] > Certificate will not expire
	I0831 16:05:22.033115    5342 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0831 16:05:22.037274    5342 command_runner.go:130] > Certificate will not expire
	I0831 16:05:22.037374    5342 kubeadm.go:392] StartCluster: {Name:multinode-957000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:multinode-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.13 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true} {Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:05:22.037492    5342 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 16:05:22.050318    5342 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0831 16:05:22.057756    5342 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I0831 16:05:22.057772    5342 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I0831 16:05:22.057779    5342 command_runner.go:130] > /var/lib/minikube/etcd:
	I0831 16:05:22.057783    5342 command_runner.go:130] > member
	I0831 16:05:22.057841    5342 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0831 16:05:22.057851    5342 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0831 16:05:22.057891    5342 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0831 16:05:22.065047    5342 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0831 16:05:22.065364    5342 kubeconfig.go:47] verify endpoint returned: get endpoint: "multinode-957000" does not appear in /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:05:22.065452    5342 kubeconfig.go:62] /Users/jenkins/minikube-integration/18943-957/kubeconfig needs updating (will repair): [kubeconfig missing "multinode-957000" cluster setting kubeconfig missing "multinode-957000" context setting]
	I0831 16:05:22.065657    5342 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 16:05:22.066390    5342 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:05:22.066588    5342 kapi.go:59] client config for multinode-957000: &rest.Config{Host:"https://192.169.0.13:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x352cc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 16:05:22.066907    5342 cert_rotation.go:140] Starting client certificate rotation controller
	I0831 16:05:22.067092    5342 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0831 16:05:22.074072    5342 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.13
	I0831 16:05:22.074092    5342 kubeadm.go:1160] stopping kube-system containers ...
	I0831 16:05:22.074143    5342 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0831 16:05:22.088733    5342 command_runner.go:130] > 643a3abbab48
	I0831 16:05:22.088745    5342 command_runner.go:130] > 93e675b8bc50
	I0831 16:05:22.088749    5342 command_runner.go:130] > 634429fa66e1
	I0831 16:05:22.088752    5342 command_runner.go:130] > 718747f5c8c6
	I0831 16:05:22.088756    5342 command_runner.go:130] > 5960dead3edc
	I0831 16:05:22.088759    5342 command_runner.go:130] > d6ba988e6369
	I0831 16:05:22.088762    5342 command_runner.go:130] > ac20eb760f62
	I0831 16:05:22.088765    5342 command_runner.go:130] > 9469c6604c28
	I0831 16:05:22.088768    5342 command_runner.go:130] > 47934ef0bc6f
	I0831 16:05:22.088772    5342 command_runner.go:130] > 52037bd64f52
	I0831 16:05:22.088775    5342 command_runner.go:130] > 6a2eb4fcc96c
	I0831 16:05:22.088781    5342 command_runner.go:130] > b244e0b6607c
	I0831 16:05:22.088785    5342 command_runner.go:130] > 039c066f5489
	I0831 16:05:22.088789    5342 command_runner.go:130] > e020d44ad2a0
	I0831 16:05:22.088792    5342 command_runner.go:130] > eacabe17d95a
	I0831 16:05:22.088795    5342 command_runner.go:130] > d1171a7cb88a
	I0831 16:05:22.089502    5342 docker.go:483] Stopping containers: [643a3abbab48 93e675b8bc50 634429fa66e1 718747f5c8c6 5960dead3edc d6ba988e6369 ac20eb760f62 9469c6604c28 47934ef0bc6f 52037bd64f52 6a2eb4fcc96c b244e0b6607c 039c066f5489 e020d44ad2a0 eacabe17d95a d1171a7cb88a]
	I0831 16:05:22.089570    5342 ssh_runner.go:195] Run: docker stop 643a3abbab48 93e675b8bc50 634429fa66e1 718747f5c8c6 5960dead3edc d6ba988e6369 ac20eb760f62 9469c6604c28 47934ef0bc6f 52037bd64f52 6a2eb4fcc96c b244e0b6607c 039c066f5489 e020d44ad2a0 eacabe17d95a d1171a7cb88a
	I0831 16:05:22.104167    5342 command_runner.go:130] > 643a3abbab48
	I0831 16:05:22.104179    5342 command_runner.go:130] > 93e675b8bc50
	I0831 16:05:22.104183    5342 command_runner.go:130] > 634429fa66e1
	I0831 16:05:22.104191    5342 command_runner.go:130] > 718747f5c8c6
	I0831 16:05:22.104195    5342 command_runner.go:130] > 5960dead3edc
	I0831 16:05:22.104198    5342 command_runner.go:130] > d6ba988e6369
	I0831 16:05:22.104201    5342 command_runner.go:130] > ac20eb760f62
	I0831 16:05:22.104204    5342 command_runner.go:130] > 9469c6604c28
	I0831 16:05:22.104208    5342 command_runner.go:130] > 47934ef0bc6f
	I0831 16:05:22.104212    5342 command_runner.go:130] > 52037bd64f52
	I0831 16:05:22.104215    5342 command_runner.go:130] > 6a2eb4fcc96c
	I0831 16:05:22.104218    5342 command_runner.go:130] > b244e0b6607c
	I0831 16:05:22.104222    5342 command_runner.go:130] > 039c066f5489
	I0831 16:05:22.104225    5342 command_runner.go:130] > e020d44ad2a0
	I0831 16:05:22.104230    5342 command_runner.go:130] > eacabe17d95a
	I0831 16:05:22.104235    5342 command_runner.go:130] > d1171a7cb88a
	I0831 16:05:22.104346    5342 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I0831 16:05:22.116296    5342 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0831 16:05:22.123545    5342 command_runner.go:130] ! ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	I0831 16:05:22.123558    5342 command_runner.go:130] ! ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	I0831 16:05:22.123575    5342 command_runner.go:130] ! ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	I0831 16:05:22.123582    5342 command_runner.go:130] ! ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0831 16:05:22.123707    5342 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0831 16:05:22.123714    5342 kubeadm.go:157] found existing configuration files:
	
	I0831 16:05:22.123754    5342 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0831 16:05:22.130665    5342 command_runner.go:130] ! grep: /etc/kubernetes/admin.conf: No such file or directory
	I0831 16:05:22.130683    5342 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0831 16:05:22.130730    5342 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0831 16:05:22.138061    5342 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0831 16:05:22.145027    5342 command_runner.go:130] ! grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0831 16:05:22.145046    5342 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0831 16:05:22.145094    5342 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0831 16:05:22.152205    5342 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0831 16:05:22.159005    5342 command_runner.go:130] ! grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0831 16:05:22.159025    5342 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0831 16:05:22.159058    5342 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0831 16:05:22.166212    5342 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0831 16:05:22.172968    5342 command_runner.go:130] ! grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0831 16:05:22.173048    5342 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0831 16:05:22.173104    5342 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0831 16:05:22.180209    5342 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0831 16:05:22.187506    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I0831 16:05:22.256335    5342 command_runner.go:130] > [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0831 16:05:22.256516    5342 command_runner.go:130] > [certs] Using existing ca certificate authority
	I0831 16:05:22.256698    5342 command_runner.go:130] > [certs] Using existing apiserver certificate and key on disk
	I0831 16:05:22.256900    5342 command_runner.go:130] > [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I0831 16:05:22.257126    5342 command_runner.go:130] > [certs] Using existing front-proxy-ca certificate authority
	I0831 16:05:22.257309    5342 command_runner.go:130] > [certs] Using existing front-proxy-client certificate and key on disk
	I0831 16:05:22.257554    5342 command_runner.go:130] > [certs] Using existing etcd/ca certificate authority
	I0831 16:05:22.257733    5342 command_runner.go:130] > [certs] Using existing etcd/server certificate and key on disk
	I0831 16:05:22.257931    5342 command_runner.go:130] > [certs] Using existing etcd/peer certificate and key on disk
	I0831 16:05:22.258090    5342 command_runner.go:130] > [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I0831 16:05:22.258254    5342 command_runner.go:130] > [certs] Using existing apiserver-etcd-client certificate and key on disk
	I0831 16:05:22.258428    5342 command_runner.go:130] > [certs] Using the existing "sa" key
	I0831 16:05:22.259356    5342 command_runner.go:130] ! W0831 23:05:22.367750    1319 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:22.259376    5342 command_runner.go:130] ! W0831 23:05:22.369141    1319 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:22.259391    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I0831 16:05:22.291268    5342 command_runner.go:130] > [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0831 16:05:22.413879    5342 command_runner.go:130] > [kubeconfig] Writing "admin.conf" kubeconfig file
	I0831 16:05:22.764394    5342 command_runner.go:130] > [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0831 16:05:22.960757    5342 command_runner.go:130] > [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0831 16:05:23.017351    5342 command_runner.go:130] > [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0831 16:05:23.086796    5342 command_runner.go:130] > [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0831 16:05:23.088874    5342 command_runner.go:130] ! W0831 23:05:22.404360    1323 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:23.088898    5342 command_runner.go:130] ! W0831 23:05:22.404853    1323 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:23.088912    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I0831 16:05:23.137007    5342 command_runner.go:130] > [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0831 16:05:23.141996    5342 command_runner.go:130] > [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0831 16:05:23.142035    5342 command_runner.go:130] > [kubelet-start] Starting the kubelet
	I0831 16:05:23.243428    5342 command_runner.go:130] ! W0831 23:05:23.237808    1328 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:23.243449    5342 command_runner.go:130] ! W0831 23:05:23.238419    1328 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:23.243467    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I0831 16:05:23.303475    5342 command_runner.go:130] > [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0831 16:05:23.303496    5342 command_runner.go:130] > [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0831 16:05:23.314102    5342 command_runner.go:130] > [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0831 16:05:23.314698    5342 command_runner.go:130] > [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0831 16:05:23.317114    5342 command_runner.go:130] ! W0831 23:05:23.416463    1356 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:23.317133    5342 command_runner.go:130] ! W0831 23:05:23.417015    1356 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:23.317149    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I0831 16:05:23.384683    5342 command_runner.go:130] > [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0831 16:05:23.390191    5342 command_runner.go:130] ! W0831 23:05:23.492556    1364 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:23.390210    5342 command_runner.go:130] ! W0831 23:05:23.496069    1364 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:23.390232    5342 api_server.go:52] waiting for apiserver process to appear ...
	I0831 16:05:23.390296    5342 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 16:05:23.891789    5342 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 16:05:24.390667    5342 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 16:05:24.401897    5342 command_runner.go:130] > 1696
	I0831 16:05:24.402080    5342 api_server.go:72] duration metric: took 1.011844604s to wait for apiserver process to appear ...
	I0831 16:05:24.402092    5342 api_server.go:88] waiting for apiserver healthz status ...
	I0831 16:05:24.402113    5342 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 16:05:26.278143    5342 api_server.go:279] https://192.169.0.13:8443/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W0831 16:05:26.278159    5342 api_server.go:103] status: https://192.169.0.13:8443/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I0831 16:05:26.278167    5342 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 16:05:26.286617    5342 api_server.go:279] https://192.169.0.13:8443/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W0831 16:05:26.286642    5342 api_server.go:103] status: https://192.169.0.13:8443/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I0831 16:05:26.402555    5342 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 16:05:26.414094    5342 api_server.go:279] https://192.169.0.13:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0831 16:05:26.414112    5342 api_server.go:103] status: https://192.169.0.13:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0831 16:05:26.902378    5342 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 16:05:26.907101    5342 api_server.go:279] https://192.169.0.13:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0831 16:05:26.907116    5342 api_server.go:103] status: https://192.169.0.13:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0831 16:05:27.402275    5342 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 16:05:27.413106    5342 api_server.go:279] https://192.169.0.13:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0831 16:05:27.413126    5342 api_server.go:103] status: https://192.169.0.13:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/start-system-namespaces-controller ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-controller ok
	[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
	[+]poststarthook/start-legacy-token-tracking-controller ok
	[+]poststarthook/start-service-ip-repair-controllers ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/bootstrap-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-status-local-available-controller ok
	[+]poststarthook/apiservice-status-remote-available-controller ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-discovery-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0831 16:05:27.903290    5342 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 16:05:27.907124    5342 api_server.go:279] https://192.169.0.13:8443/healthz returned 200:
	ok
	I0831 16:05:27.907193    5342 round_trippers.go:463] GET https://192.169.0.13:8443/version
	I0831 16:05:27.907198    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:27.907207    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:27.907211    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:27.912550    5342 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0831 16:05:27.912560    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:27.912566    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:27.912570    5342 round_trippers.go:580]     Content-Length: 263
	I0831 16:05:27.912580    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:27.912585    5342 round_trippers.go:580]     Audit-Id: 646566a3-3038-4dde-b14f-58eeb2f3eb47
	I0831 16:05:27.912588    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:27.912590    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:27.912593    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:27.912614    5342 request.go:1351] Response Body: {
	  "major": "1",
	  "minor": "31",
	  "gitVersion": "v1.31.0",
	  "gitCommit": "9edcffcde5595e8a5b1a35f88c421764e575afce",
	  "gitTreeState": "clean",
	  "buildDate": "2024-08-13T07:28:49Z",
	  "goVersion": "go1.22.5",
	  "compiler": "gc",
	  "platform": "linux/amd64"
	}
	I0831 16:05:27.912665    5342 api_server.go:141] control plane version: v1.31.0
	I0831 16:05:27.912676    5342 api_server.go:131] duration metric: took 3.510558835s to wait for apiserver health ...
	I0831 16:05:27.912682    5342 cni.go:84] Creating CNI manager for ""
	I0831 16:05:27.912688    5342 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0831 16:05:27.936025    5342 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0831 16:05:27.974057    5342 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0831 16:05:27.981108    5342 command_runner.go:130] >   File: /opt/cni/bin/portmap
	I0831 16:05:27.981123    5342 command_runner.go:130] >   Size: 2785880   	Blocks: 5448       IO Block: 4096   regular file
	I0831 16:05:27.981127    5342 command_runner.go:130] > Device: 0,17	Inode: 3500        Links: 1
	I0831 16:05:27.981132    5342 command_runner.go:130] > Access: (0755/-rwxr-xr-x)  Uid: (    0/    root)   Gid: (    0/    root)
	I0831 16:05:27.981160    5342 command_runner.go:130] > Access: 2024-08-31 23:04:49.728411249 +0000
	I0831 16:05:27.981168    5342 command_runner.go:130] > Modify: 2024-08-28 21:39:18.000000000 +0000
	I0831 16:05:27.981184    5342 command_runner.go:130] > Change: 2024-08-31 23:04:48.166411357 +0000
	I0831 16:05:27.981188    5342 command_runner.go:130] >  Birth: -
	I0831 16:05:27.981407    5342 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.31.0/kubectl ...
	I0831 16:05:27.981418    5342 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2601 bytes)
	I0831 16:05:28.005580    5342 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0831 16:05:28.308673    5342 command_runner.go:130] > clusterrole.rbac.authorization.k8s.io/kindnet unchanged
	I0831 16:05:28.335149    5342 command_runner.go:130] > clusterrolebinding.rbac.authorization.k8s.io/kindnet unchanged
	I0831 16:05:28.458575    5342 command_runner.go:130] > serviceaccount/kindnet unchanged
	I0831 16:05:28.509058    5342 command_runner.go:130] > daemonset.apps/kindnet configured
	I0831 16:05:28.510491    5342 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 16:05:28.510535    5342 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I0831 16:05:28.510544    5342 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I0831 16:05:28.510596    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods
	I0831 16:05:28.510601    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.510606    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.510610    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.512904    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:28.512914    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.512920    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.512924    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.512928    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.512931    5342 round_trippers.go:580]     Audit-Id: cd084c9b-91e0-4e0e-a47c-d356ac8ac912
	I0831 16:05:28.512935    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.512938    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.513583    5342 request.go:1351] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"resourceVersion":"755"},"items":[{"metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f
:preferredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{ [truncated 90530 chars]
	I0831 16:05:28.516761    5342 system_pods.go:59] 12 kube-system pods found
	I0831 16:05:28.516784    5342 system_pods.go:61] "coredns-6f6b679f8f-q4s6r" [b794efa0-8367-452b-90be-870e8d349f6f] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0831 16:05:28.516790    5342 system_pods.go:61] "etcd-multinode-957000" [b4833809-a14f-49f4-b877-9f7e4be0bd39] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I0831 16:05:28.516795    5342 system_pods.go:61] "kindnet-5vc9x" [a8f9df46-0974-4620-a7c1-6022793f34f1] Running / Ready:ContainersNotReady (containers with unready status: [kindnet-cni]) / ContainersReady:ContainersNotReady (containers with unready status: [kindnet-cni])
	I0831 16:05:28.516799    5342 system_pods.go:61] "kindnet-cjqw5" [4a7f98b7-3e6d-4e84-b4ee-6838db3d880b] Running
	I0831 16:05:28.516803    5342 system_pods.go:61] "kindnet-gkhfh" [8c3c358a-7566-4871-a514-82c6190fab18] Running
	I0831 16:05:28.516806    5342 system_pods.go:61] "kube-apiserver-multinode-957000" [e549c883-0eb6-43a1-be40-c8d2f3a9468e] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I0831 16:05:28.516810    5342 system_pods.go:61] "kube-controller-manager-multinode-957000" [8a82b721-75a3-4460-b9eb-bfc4db35f20e] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0831 16:05:28.516813    5342 system_pods.go:61] "kube-proxy-cplv4" [56ad32e2-f2ba-4fa5-b093-790a5205b4f2] Running
	I0831 16:05:28.516820    5342 system_pods.go:61] "kube-proxy-ndfs6" [34c16419-4c10-41bd-9446-75ba130cbe63] Running
	I0831 16:05:28.516824    5342 system_pods.go:61] "kube-proxy-zf7j6" [e84c5d55-f27d-4d2a-9b41-6f1e6100ad2e] Running / Ready:ContainersNotReady (containers with unready status: [kube-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-proxy])
	I0831 16:05:28.516828    5342 system_pods.go:61] "kube-scheduler-multinode-957000" [f48d9647-8460-48da-a5b0-fc471f5536ad] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I0831 16:05:28.516833    5342 system_pods.go:61] "storage-provisioner" [f389bc9a-20cc-4e07-bc7f-f418f53773c9] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0831 16:05:28.516837    5342 system_pods.go:74] duration metric: took 6.336594ms to wait for pod list to return data ...
	I0831 16:05:28.516844    5342 node_conditions.go:102] verifying NodePressure condition ...
	I0831 16:05:28.516888    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes
	I0831 16:05:28.516892    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.516897    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.516900    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.521789    5342 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 16:05:28.521806    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.521815    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.521821    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.521826    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.521829    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.521831    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.521834    5342 round_trippers.go:580]     Audit-Id: 8342b728-9c88-4e6b-bd7c-8f3abab46d30
	I0831 16:05:28.522092    5342 request.go:1351] Response Body: {"kind":"NodeList","apiVersion":"v1","metadata":{"resourceVersion":"755"},"items":[{"metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"743","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFiel
ds":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time" [truncated 14782 chars]
	I0831 16:05:28.523299    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:05:28.523453    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:05:28.523472    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:05:28.523479    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:05:28.523485    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:05:28.523494    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:05:28.523499    5342 node_conditions.go:105] duration metric: took 6.651631ms to run NodePressure ...
	I0831 16:05:28.523519    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm init phase addon all --config /var/tmp/minikube/kubeadm.yaml"
	I0831 16:05:28.596032    5342 command_runner.go:130] ! W0831 23:05:28.715304    2081 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "ClusterConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:28.596569    5342 command_runner.go:130] ! W0831 23:05:28.715936    2081 common.go:101] your configuration file uses a deprecated API spec: "kubeadm.k8s.io/v1beta3" (kind: "InitConfiguration"). Please use 'kubeadm config migrate --old-config old.yaml --new-config new.yaml', which will write the new, similar spec using a newer API version.
	I0831 16:05:28.812453    5342 command_runner.go:130] > [addons] Applied essential addon: CoreDNS
	I0831 16:05:28.812468    5342 command_runner.go:130] > [addons] Applied essential addon: kube-proxy
	I0831 16:05:28.812480    5342 kubeadm.go:724] waiting for restarted kubelet to initialise ...
	I0831 16:05:28.812537    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods?labelSelector=tier%3Dcontrol-plane
	I0831 16:05:28.812544    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.812550    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.812553    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.814545    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:28.814559    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.814567    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.814573    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.814577    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.814581    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.814585    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.814590    5342 round_trippers.go:580]     Audit-Id: 5bf3e488-f4ee-4704-828a-f99821a2de48
	I0831 16:05:28.814828    5342 request.go:1351] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"resourceVersion":"760"},"items":[{"metadata":{"name":"etcd-multinode-957000","namespace":"kube-system","uid":"b4833809-a14f-49f4-b877-9f7e4be0bd39","resourceVersion":"752","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"etcd","tier":"control-plane"},"annotations":{"kubeadm.kubernetes.io/etcd.advertise-client-urls":"https://192.169.0.13:2379","kubernetes.io/config.hash":"7ee006dc216d695a2fa4355a2abea57a","kubernetes.io/config.mirror":"7ee006dc216d695a2fa4355a2abea57a","kubernetes.io/config.seen":"2024-08-31T22:57:31.349647295Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations"
:{".":{},"f:kubeadm.kubernetes.io/etcd.advertise-client-urls":{},"f:kub [truncated 31218 chars]
	I0831 16:05:28.815549    5342 kubeadm.go:739] kubelet initialised
	I0831 16:05:28.815559    5342 kubeadm.go:740] duration metric: took 3.067378ms waiting for restarted kubelet to initialise ...
	I0831 16:05:28.815565    5342 pod_ready.go:36] extra waiting up to 4m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 16:05:28.815594    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods
	I0831 16:05:28.815599    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.815605    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.815608    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.817422    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:28.817434    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.817441    5342 round_trippers.go:580]     Audit-Id: 87a5714c-a353-4df3-9ba7-7ce131a65b24
	I0831 16:05:28.817446    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.817450    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.817455    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.817462    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.817467    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.818569    5342 request.go:1351] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"resourceVersion":"760"},"items":[{"metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f
:preferredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{ [truncated 89937 chars]
	I0831 16:05:28.820477    5342 pod_ready.go:79] waiting up to 4m0s for pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:28.820522    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:28.820527    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.820533    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.820537    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.823039    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:28.823048    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.823053    5342 round_trippers.go:580]     Audit-Id: d750058b-0315-4b68-884c-d94f151799e4
	I0831 16:05:28.823057    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.823060    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.823066    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.823070    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.823074    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.823360    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:28.823639    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:28.823646    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.823654    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.823659    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.825188    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:28.825201    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.825208    5342 round_trippers.go:580]     Audit-Id: f5e6445e-db85-4f64-a220-261f1d02f835
	I0831 16:05:28.825212    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.825215    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.825228    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.825233    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.825237    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.825319    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"743","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5292 chars]
	I0831 16:05:28.825499    5342 pod_ready.go:98] node "multinode-957000" hosting pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:28.825508    5342 pod_ready.go:82] duration metric: took 5.022158ms for pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace to be "Ready" ...
	E0831 16:05:28.825514    5342 pod_ready.go:67] WaitExtra: waitPodCondition: node "multinode-957000" hosting pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:28.825519    5342 pod_ready.go:79] waiting up to 4m0s for pod "etcd-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:28.825551    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/etcd-multinode-957000
	I0831 16:05:28.825556    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.825561    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.825565    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.826956    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:28.826965    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.826970    5342 round_trippers.go:580]     Audit-Id: 805bd1a9-151e-4b47-bc0a-6652c38c166b
	I0831 16:05:28.826973    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.826976    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.826978    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.826981    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.826983    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.827224    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"etcd-multinode-957000","namespace":"kube-system","uid":"b4833809-a14f-49f4-b877-9f7e4be0bd39","resourceVersion":"752","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"etcd","tier":"control-plane"},"annotations":{"kubeadm.kubernetes.io/etcd.advertise-client-urls":"https://192.169.0.13:2379","kubernetes.io/config.hash":"7ee006dc216d695a2fa4355a2abea57a","kubernetes.io/config.mirror":"7ee006dc216d695a2fa4355a2abea57a","kubernetes.io/config.seen":"2024-08-31T22:57:31.349647295Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubeadm.kubernetes.io/etcd.advertise-cl
ient-urls":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config. [truncated 6887 chars]
	I0831 16:05:28.827452    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:28.827459    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.827464    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.827468    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.828686    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:28.828694    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.828699    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.828704    5342 round_trippers.go:580]     Audit-Id: 29e847a6-dae3-44ae-83ca-0b5481e796c6
	I0831 16:05:28.828707    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.828713    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.828716    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.828718    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.828974    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"743","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5292 chars]
	I0831 16:05:28.829139    5342 pod_ready.go:98] node "multinode-957000" hosting pod "etcd-multinode-957000" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:28.829148    5342 pod_ready.go:82] duration metric: took 3.62377ms for pod "etcd-multinode-957000" in "kube-system" namespace to be "Ready" ...
	E0831 16:05:28.829154    5342 pod_ready.go:67] WaitExtra: waitPodCondition: node "multinode-957000" hosting pod "etcd-multinode-957000" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:28.829163    5342 pod_ready.go:79] waiting up to 4m0s for pod "kube-apiserver-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:28.829218    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-multinode-957000
	I0831 16:05:28.829223    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.829228    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.829231    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.830608    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:28.830615    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.830619    5342 round_trippers.go:580]     Audit-Id: 959474bb-56ed-4ade-889a-35a4ceb6d79c
	I0831 16:05:28.830623    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.830626    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.830629    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.830632    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.830634    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.830936    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-apiserver-multinode-957000","namespace":"kube-system","uid":"e549c883-0eb6-43a1-be40-c8d2f3a9468e","resourceVersion":"751","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-apiserver","tier":"control-plane"},"annotations":{"kubeadm.kubernetes.io/kube-apiserver.advertise-address.endpoint":"192.169.0.13:8443","kubernetes.io/config.hash":"5db461e18c39888a5ab16fd535bfcb2e","kubernetes.io/config.mirror":"5db461e18c39888a5ab16fd535bfcb2e","kubernetes.io/config.seen":"2024-08-31T22:57:31.349647948Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubeadm.kube
rnetes.io/kube-apiserver.advertise-address.endpoint":{},"f:kubernetes.i [truncated 8135 chars]
	I0831 16:05:28.831163    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:28.831170    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.831175    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.831180    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.832368    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:28.832375    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.832379    5342 round_trippers.go:580]     Audit-Id: 4f067dc8-a0cc-4d76-8356-4fd9c2d7ee79
	I0831 16:05:28.832383    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.832385    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.832387    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.832390    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.832392    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.832542    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"743","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5292 chars]
	I0831 16:05:28.832706    5342 pod_ready.go:98] node "multinode-957000" hosting pod "kube-apiserver-multinode-957000" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:28.832715    5342 pod_ready.go:82] duration metric: took 3.547112ms for pod "kube-apiserver-multinode-957000" in "kube-system" namespace to be "Ready" ...
	E0831 16:05:28.832721    5342 pod_ready.go:67] WaitExtra: waitPodCondition: node "multinode-957000" hosting pod "kube-apiserver-multinode-957000" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:28.832728    5342 pod_ready.go:79] waiting up to 4m0s for pod "kube-controller-manager-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:28.832757    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-multinode-957000
	I0831 16:05:28.832762    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.832767    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.832771    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.834057    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:28.834064    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.834068    5342 round_trippers.go:580]     Audit-Id: ff31b36b-3d79-4ded-b0be-8f102ce8f578
	I0831 16:05:28.834071    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.834089    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.834093    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.834095    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.834098    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:28 GMT
	I0831 16:05:28.834600    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-controller-manager-multinode-957000","namespace":"kube-system","uid":"8a82b721-75a3-4460-b9eb-bfc4db35f20e","resourceVersion":"749","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-controller-manager","tier":"control-plane"},"annotations":{"kubernetes.io/config.hash":"9edb08d8378ca77b90e86ed290d828c5","kubernetes.io/config.mirror":"9edb08d8378ca77b90e86ed290d828c5","kubernetes.io/config.seen":"2024-08-31T22:57:31.349643093Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config.mirror":{},"f:kubernetes.i
o/config.seen":{},"f:kubernetes.io/config.source":{}},"f:labels":{".":{ [truncated 7726 chars]
	I0831 16:05:28.910762    5342 request.go:632] Waited for 75.894986ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:28.910817    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:28.910862    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:28.910877    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:28.910886    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:28.914291    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:28.914305    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:28.914314    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:29 GMT
	I0831 16:05:28.914318    5342 round_trippers.go:580]     Audit-Id: 68d59604-4c53-4726-8f1c-8076aba9939f
	I0831 16:05:28.914323    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:28.914327    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:28.914331    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:28.914339    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:28.914663    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"743","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5292 chars]
	I0831 16:05:28.914851    5342 pod_ready.go:98] node "multinode-957000" hosting pod "kube-controller-manager-multinode-957000" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:28.914866    5342 pod_ready.go:82] duration metric: took 82.133346ms for pod "kube-controller-manager-multinode-957000" in "kube-system" namespace to be "Ready" ...
	E0831 16:05:28.914873    5342 pod_ready.go:67] WaitExtra: waitPodCondition: node "multinode-957000" hosting pod "kube-controller-manager-multinode-957000" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:28.914879    5342 pod_ready.go:79] waiting up to 4m0s for pod "kube-proxy-cplv4" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:29.111261    5342 request.go:632] Waited for 196.272605ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cplv4
	I0831 16:05:29.111309    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cplv4
	I0831 16:05:29.111317    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:29.111328    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:29.111335    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:29.114332    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:29.114345    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:29.114353    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:29.114360    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:29 GMT
	I0831 16:05:29.114370    5342 round_trippers.go:580]     Audit-Id: cd3ea13f-4d8c-47ca-91ce-56e267d2f6ef
	I0831 16:05:29.114380    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:29.114388    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:29.114399    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:29.114739    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-cplv4","generateName":"kube-proxy-","namespace":"kube-system","uid":"56ad32e2-f2ba-4fa5-b093-790a5205b4f2","resourceVersion":"476","creationTimestamp":"2024-08-31T22:58:18Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:58:18Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:r
equiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k: [truncated 6197 chars]
	I0831 16:05:29.310743    5342 request.go:632] Waited for 195.61869ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:05:29.310833    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:05:29.310843    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:29.310854    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:29.310865    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:29.313601    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:29.313613    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:29.313621    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:29.313627    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:29.313632    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:29.313638    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:29 GMT
	I0831 16:05:29.313643    5342 round_trippers.go:580]     Audit-Id: a7015190-6804-451a-9d1a-17de98a54c73
	I0831 16:05:29.313648    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:29.313832    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"80356a3f-91f2-42b6-b267-2e41c24b1477","resourceVersion":"542","creationTimestamp":"2024-08-31T22:58:18Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T15_58_18_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubeadm","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:58:18Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{"f:kubeadm.alpha.kubernetes.io/cri-socket":{}}}}},{"man [truncated 3818 chars]
	I0831 16:05:29.314064    5342 pod_ready.go:93] pod "kube-proxy-cplv4" in "kube-system" namespace has status "Ready":"True"
	I0831 16:05:29.314076    5342 pod_ready.go:82] duration metric: took 399.189129ms for pod "kube-proxy-cplv4" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:29.314086    5342 pod_ready.go:79] waiting up to 4m0s for pod "kube-proxy-ndfs6" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:29.510851    5342 request.go:632] Waited for 196.714507ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-ndfs6
	I0831 16:05:29.510945    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-ndfs6
	I0831 16:05:29.510955    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:29.510966    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:29.510974    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:29.513589    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:29.513605    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:29.513630    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:29.513644    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:29 GMT
	I0831 16:05:29.513652    5342 round_trippers.go:580]     Audit-Id: 36b070ef-4af5-4672-aa05-930bba3fb245
	I0831 16:05:29.513657    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:29.513664    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:29.513671    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:29.513824    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-ndfs6","generateName":"kube-proxy-","namespace":"kube-system","uid":"34c16419-4c10-41bd-9446-75ba130cbe63","resourceVersion":"707","creationTimestamp":"2024-08-31T22:59:10Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:59:10Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:r
equiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k: [truncated 6197 chars]
	I0831 16:05:29.710736    5342 request.go:632] Waited for 196.599295ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:05:29.710853    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:05:29.710862    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:29.710903    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:29.710920    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:29.713769    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:29.713781    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:29.713788    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:29 GMT
	I0831 16:05:29.713792    5342 round_trippers.go:580]     Audit-Id: da5f6ffb-eda7-4ac7-b2d1-126bcd004cf6
	I0831 16:05:29.713802    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:29.713806    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:29.713811    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:29.713814    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:29.713929    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"0867ece2-944d-429d-b3c6-0eab243276ee","resourceVersion":"732","creationTimestamp":"2024-08-31T23:00:04Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_00_04_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubeadm","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:00:04Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{"f:kubeadm.alpha.kubernetes.io/cri-socket":{}}}}},{"man [truncated 3635 chars]
	I0831 16:05:29.714159    5342 pod_ready.go:93] pod "kube-proxy-ndfs6" in "kube-system" namespace has status "Ready":"True"
	I0831 16:05:29.714170    5342 pod_ready.go:82] duration metric: took 400.074957ms for pod "kube-proxy-ndfs6" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:29.714179    5342 pod_ready.go:79] waiting up to 4m0s for pod "kube-proxy-zf7j6" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:29.912079    5342 request.go:632] Waited for 197.809333ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zf7j6
	I0831 16:05:29.912158    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zf7j6
	I0831 16:05:29.912168    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:29.912178    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:29.912188    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:29.914968    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:29.914981    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:29.914989    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:29.915036    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:29.915044    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:29.915047    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:29.915050    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:30 GMT
	I0831 16:05:29.915054    5342 round_trippers.go:580]     Audit-Id: 3116d3ac-7fe7-42d3-a6d9-e42aef2c7db7
	I0831 16:05:29.915212    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-zf7j6","generateName":"kube-proxy-","namespace":"kube-system","uid":"e84c5d55-f27d-4d2a-9b41-6f1e6100ad2e","resourceVersion":"756","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:r
equiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k: [truncated 6394 chars]
	I0831 16:05:30.111114    5342 request.go:632] Waited for 195.517364ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:30.111231    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:30.111242    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:30.111256    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:30.111263    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:30.114773    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:30.114792    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:30.114801    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:30 GMT
	I0831 16:05:30.114813    5342 round_trippers.go:580]     Audit-Id: 4f571c62-1083-4281-822e-d7a271d809c2
	I0831 16:05:30.114818    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:30.114822    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:30.114826    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:30.114831    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:30.115327    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:30.115603    5342 pod_ready.go:98] node "multinode-957000" hosting pod "kube-proxy-zf7j6" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:30.115613    5342 pod_ready.go:82] duration metric: took 401.427015ms for pod "kube-proxy-zf7j6" in "kube-system" namespace to be "Ready" ...
	E0831 16:05:30.115620    5342 pod_ready.go:67] WaitExtra: waitPodCondition: node "multinode-957000" hosting pod "kube-proxy-zf7j6" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:30.115625    5342 pod_ready.go:79] waiting up to 4m0s for pod "kube-scheduler-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:30.310939    5342 request.go:632] Waited for 195.273039ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-multinode-957000
	I0831 16:05:30.311003    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-multinode-957000
	I0831 16:05:30.311009    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:30.311015    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:30.311019    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:30.319096    5342 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0831 16:05:30.319108    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:30.319113    5342 round_trippers.go:580]     Audit-Id: abb60614-1eff-466a-9f36-490c50e9a3db
	I0831 16:05:30.319117    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:30.319120    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:30.319122    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:30.319124    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:30.319127    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:30 GMT
	I0831 16:05:30.319392    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-scheduler-multinode-957000","namespace":"kube-system","uid":"f48d9647-8460-48da-a5b0-fc471f5536ad","resourceVersion":"750","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-scheduler","tier":"control-plane"},"annotations":{"kubernetes.io/config.hash":"b74e8393ad84ccbcf23f7560eda422b0","kubernetes.io/config.mirror":"b74e8393ad84ccbcf23f7560eda422b0","kubernetes.io/config.seen":"2024-08-31T22:57:31.349646560Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config.mirror":{},"f:kubernetes.io/config.seen":{},
"f:kubernetes.io/config.source":{}},"f:labels":{".":{},"f:component":{} [truncated 5438 chars]
	I0831 16:05:30.511213    5342 request.go:632] Waited for 191.57145ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:30.511335    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:30.511345    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:30.511357    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:30.511363    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:30.514655    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:30.514671    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:30.514687    5342 round_trippers.go:580]     Audit-Id: 621dbd66-a7f5-4695-aa42-53ce43f003a1
	I0831 16:05:30.514692    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:30.514696    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:30.514700    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:30.514705    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:30.514708    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:30 GMT
	I0831 16:05:30.515327    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:30.515593    5342 pod_ready.go:98] node "multinode-957000" hosting pod "kube-scheduler-multinode-957000" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:30.515606    5342 pod_ready.go:82] duration metric: took 399.973013ms for pod "kube-scheduler-multinode-957000" in "kube-system" namespace to be "Ready" ...
	E0831 16:05:30.515614    5342 pod_ready.go:67] WaitExtra: waitPodCondition: node "multinode-957000" hosting pod "kube-scheduler-multinode-957000" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000" has status "Ready":"False"
	I0831 16:05:30.515620    5342 pod_ready.go:39] duration metric: took 1.700038562s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 16:05:30.515639    5342 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0831 16:05:30.527258    5342 command_runner.go:130] > -16
	I0831 16:05:30.527507    5342 ops.go:34] apiserver oom_adj: -16
	I0831 16:05:30.527514    5342 kubeadm.go:597] duration metric: took 8.469608944s to restartPrimaryControlPlane
	I0831 16:05:30.527520    5342 kubeadm.go:394] duration metric: took 8.490098609s to StartCluster
	I0831 16:05:30.527529    5342 settings.go:142] acquiring lock: {Name:mk4b1b0a7439feab82be8f6d66b4d3c4d11c9b5f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 16:05:30.527634    5342 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:05:30.528073    5342 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/18943-957/kubeconfig: {Name:mkc7259a3f17d77b84078e55eed4ed8b5d2486ca Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 16:05:30.528421    5342 start.go:235] Will wait 6m0s for node &{Name: IP:192.169.0.13 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0831 16:05:30.528436    5342 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0831 16:05:30.528571    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:05:30.549010    5342 out.go:177] * Verifying Kubernetes components...
	I0831 16:05:30.591589    5342 out.go:177] * Enabled addons: 
	I0831 16:05:30.612739    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:05:30.633553    5342 addons.go:510] duration metric: took 105.123025ms for enable addons: enabled=[]
	I0831 16:05:30.759192    5342 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 16:05:30.772882    5342 node_ready.go:35] waiting up to 6m0s for node "multinode-957000" to be "Ready" ...
	I0831 16:05:30.772946    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:30.772952    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:30.772958    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:30.772962    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:30.774319    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:30.774331    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:30.774336    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:30.774339    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:30.774344    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:30 GMT
	I0831 16:05:30.774346    5342 round_trippers.go:580]     Audit-Id: d08f0d9b-38a9-47b7-8b4e-a27f55f15ed8
	I0831 16:05:30.774349    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:30.774354    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:30.774436    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:31.273231    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:31.273255    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:31.273267    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:31.273276    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:31.275455    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:31.275470    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:31.275478    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:31.275483    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:31.275486    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:31.275490    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:31.275493    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:31 GMT
	I0831 16:05:31.275499    5342 round_trippers.go:580]     Audit-Id: a6884251-65f4-4dea-b5af-8ba6be0286ba
	I0831 16:05:31.275742    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:31.773446    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:31.773471    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:31.773482    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:31.773488    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:31.776090    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:31.776105    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:31.776113    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:31.776118    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:31.776122    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:31 GMT
	I0831 16:05:31.776126    5342 round_trippers.go:580]     Audit-Id: 54f202c1-0791-4cf4-88b7-012bb2332609
	I0831 16:05:31.776129    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:31.776132    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:31.776348    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:32.273509    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:32.273568    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:32.273583    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:32.273589    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:32.275898    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:32.275918    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:32.275925    5342 round_trippers.go:580]     Audit-Id: 5908ac62-1c6a-49fc-8109-96f9d93cda50
	I0831 16:05:32.275929    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:32.275935    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:32.275939    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:32.275950    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:32.275954    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:32 GMT
	I0831 16:05:32.276053    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:32.773717    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:32.773740    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:32.773751    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:32.773758    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:32.776648    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:32.776694    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:32.776709    5342 round_trippers.go:580]     Audit-Id: 90eaff7e-9fec-4c29-80db-74582226e748
	I0831 16:05:32.776714    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:32.776719    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:32.776723    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:32.776727    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:32.776732    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:32 GMT
	I0831 16:05:32.776849    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:32.777135    5342 node_ready.go:53] node "multinode-957000" has status "Ready":"False"
	I0831 16:05:33.273444    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:33.273470    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:33.273484    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:33.273491    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:33.276080    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:33.276096    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:33.276104    5342 round_trippers.go:580]     Audit-Id: 8a19d5ce-9985-42f2-9db1-710a7c704e2c
	I0831 16:05:33.276119    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:33.276125    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:33.276128    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:33.276133    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:33.276139    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:33 GMT
	I0831 16:05:33.276316    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:33.773594    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:33.773619    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:33.773631    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:33.773639    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:33.776646    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:33.776661    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:33.776668    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:33.776672    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:33.776677    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:33 GMT
	I0831 16:05:33.776681    5342 round_trippers.go:580]     Audit-Id: 01a896ff-b68a-441e-a74c-e8f20bd86504
	I0831 16:05:33.776686    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:33.776692    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:33.776986    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:34.273570    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:34.273593    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:34.273605    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:34.273611    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:34.276636    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:34.276650    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:34.276657    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:34 GMT
	I0831 16:05:34.276662    5342 round_trippers.go:580]     Audit-Id: 4a493aa0-25a2-4c4a-b69c-b648e9f4af02
	I0831 16:05:34.276665    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:34.276668    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:34.276671    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:34.276675    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:34.276761    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:34.773276    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:34.773304    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:34.773317    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:34.773325    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:34.776499    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:34.776516    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:34.776523    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:34.776527    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:34.776539    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:34.776544    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:34 GMT
	I0831 16:05:34.776547    5342 round_trippers.go:580]     Audit-Id: 4436eee7-3063-4023-aba7-209f64b71aab
	I0831 16:05:34.776555    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:34.776646    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:35.273227    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:35.273251    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:35.273262    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:35.273269    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:35.275946    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:35.275959    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:35.275968    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:35.275973    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:35.275977    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:35.275982    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:35 GMT
	I0831 16:05:35.275987    5342 round_trippers.go:580]     Audit-Id: dc1e06cd-4d2f-4af1-867c-0049e70d6497
	I0831 16:05:35.275991    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:35.276084    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:35.276339    5342 node_ready.go:53] node "multinode-957000" has status "Ready":"False"
	I0831 16:05:35.773929    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:35.773954    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:35.773965    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:35.773970    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:35.776314    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:35.776329    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:35.776343    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:35.776351    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:35.776358    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:35.776364    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:35.776393    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:35 GMT
	I0831 16:05:35.776402    5342 round_trippers.go:580]     Audit-Id: e93c2bb0-3365-4cc2-bdb4-940f5c8ae25a
	I0831 16:05:35.776641    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:36.273653    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:36.273676    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:36.273688    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:36.273694    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:36.276155    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:36.276167    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:36.276175    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:36.276184    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:36.276195    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:36.276209    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:36 GMT
	I0831 16:05:36.276214    5342 round_trippers.go:580]     Audit-Id: 76f20144-9535-4f88-902c-dfe92dbb00fd
	I0831 16:05:36.276219    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:36.276485    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:36.773408    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:36.773424    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:36.773430    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:36.773436    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:36.775126    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:36.775136    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:36.775153    5342 round_trippers.go:580]     Audit-Id: 0fb2ef7d-6a22-4e9a-960c-025b1a6eafe9
	I0831 16:05:36.775158    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:36.775162    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:36.775168    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:36.775170    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:36.775173    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:36 GMT
	I0831 16:05:36.775382    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:37.274386    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:37.274410    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:37.274422    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:37.274427    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:37.277231    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:37.277249    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:37.277260    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:37.277267    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:37 GMT
	I0831 16:05:37.277273    5342 round_trippers.go:580]     Audit-Id: ca00da9b-b107-4a72-a5a5-fa52673c1c98
	I0831 16:05:37.277279    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:37.277284    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:37.277314    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:37.277424    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:37.277686    5342 node_ready.go:53] node "multinode-957000" has status "Ready":"False"
	I0831 16:05:37.773170    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:37.773198    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:37.773208    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:37.773213    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:37.775904    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:37.775917    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:37.775924    5342 round_trippers.go:580]     Audit-Id: d0a87598-fdaa-440f-874b-8c1aca80157a
	I0831 16:05:37.775928    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:37.775931    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:37.775935    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:37.775939    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:37.775943    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:37 GMT
	I0831 16:05:37.776032    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:38.274007    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:38.274032    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:38.274043    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:38.274048    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:38.276883    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:38.276900    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:38.276910    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:38.276915    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:38.276918    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:38 GMT
	I0831 16:05:38.276922    5342 round_trippers.go:580]     Audit-Id: c6fcafbe-08bf-4651-ad27-20cf8cec7f2b
	I0831 16:05:38.276926    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:38.276931    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:38.277029    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:38.773127    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:38.773150    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:38.773162    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:38.773168    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:38.775923    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:38.775934    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:38.775941    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:38.775947    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:38.775951    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:38.775958    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:38 GMT
	I0831 16:05:38.775962    5342 round_trippers.go:580]     Audit-Id: f854ed6f-0099-4e3b-8a7b-3bb1997375ea
	I0831 16:05:38.775966    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:38.776050    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:39.274249    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:39.274277    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:39.274326    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:39.274336    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:39.276993    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:39.277011    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:39.277021    5342 round_trippers.go:580]     Audit-Id: 03b4316f-d14d-4116-b4c1-813221c6bdc9
	I0831 16:05:39.277028    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:39.277037    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:39.277045    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:39.277049    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:39.277052    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:39 GMT
	I0831 16:05:39.277253    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:39.774616    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:39.774658    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:39.774668    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:39.774674    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:39.776564    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:39.776578    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:39.776586    5342 round_trippers.go:580]     Audit-Id: 3a63d328-9b93-4bca-a1b2-3aa3675066b9
	I0831 16:05:39.776592    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:39.776601    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:39.776611    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:39.776617    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:39.776620    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:39 GMT
	I0831 16:05:39.776700    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:39.776888    5342 node_ready.go:53] node "multinode-957000" has status "Ready":"False"
	I0831 16:05:40.273144    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:40.273166    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:40.273178    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:40.273185    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:40.275620    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:40.275634    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:40.275640    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:40.275644    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:40 GMT
	I0831 16:05:40.275647    5342 round_trippers.go:580]     Audit-Id: d4d6a13d-0f03-4ec0-8fa5-3e9bbc93f10b
	I0831 16:05:40.275650    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:40.275654    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:40.275657    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:40.275799    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:40.775126    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:40.775154    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:40.775166    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:40.775173    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:40.777830    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:40.777843    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:40.777850    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:40.777854    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:40 GMT
	I0831 16:05:40.777858    5342 round_trippers.go:580]     Audit-Id: 3818c951-29e5-48c5-916c-016242978fe0
	I0831 16:05:40.777861    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:40.777866    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:40.777871    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:40.778012    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:41.273163    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:41.273193    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:41.273206    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:41.273213    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:41.275822    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:41.275839    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:41.275847    5342 round_trippers.go:580]     Audit-Id: bd4a0ff8-2775-40f6-a5ee-6d8e9c2684b6
	I0831 16:05:41.275853    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:41.275857    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:41.275860    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:41.275863    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:41.275867    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:41 GMT
	I0831 16:05:41.275952    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:41.773808    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:41.773865    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:41.773878    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:41.773888    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:41.776452    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:41.776464    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:41.776471    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:41.776498    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:41.776506    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:41 GMT
	I0831 16:05:41.776511    5342 round_trippers.go:580]     Audit-Id: a3b3c6d2-83cf-4936-8898-078e2bd50d48
	I0831 16:05:41.776514    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:41.776519    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:41.776602    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:42.273148    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:42.273182    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:42.273190    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:42.273193    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:42.274828    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:42.274839    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:42.274844    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:42.274847    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:42.274850    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:42.274852    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:42.274854    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:42 GMT
	I0831 16:05:42.274857    5342 round_trippers.go:580]     Audit-Id: 2ae4205b-8426-464c-b8cc-fbed0ac827d2
	I0831 16:05:42.274925    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:42.275117    5342 node_ready.go:53] node "multinode-957000" has status "Ready":"False"
	I0831 16:05:42.773734    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:42.773754    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:42.773766    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:42.773773    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:42.776397    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:42.776414    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:42.776421    5342 round_trippers.go:580]     Audit-Id: e3e584f5-da1c-4657-8d9a-968a1cafb2ef
	I0831 16:05:42.776427    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:42.776431    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:42.776434    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:42.776437    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:42.776442    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:42 GMT
	I0831 16:05:42.776523    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:43.273249    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:43.273271    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:43.273286    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:43.273293    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:43.275660    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:43.275672    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:43.275679    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:43 GMT
	I0831 16:05:43.275684    5342 round_trippers.go:580]     Audit-Id: e642dbd1-81bb-4dfe-8a69-3c728d246fda
	I0831 16:05:43.275688    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:43.275693    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:43.275696    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:43.275699    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:43.276005    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:43.773190    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:43.773218    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:43.773229    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:43.773234    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:43.775699    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:43.775714    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:43.775721    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:43 GMT
	I0831 16:05:43.775725    5342 round_trippers.go:580]     Audit-Id: 2854d35a-927d-4397-93ba-d52e3752a4e2
	I0831 16:05:43.775728    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:43.775732    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:43.775735    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:43.775738    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:43.775812    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:44.274371    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:44.274383    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:44.274389    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:44.274392    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:44.276187    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:44.276201    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:44.276210    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:44.276216    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:44.276229    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:44 GMT
	I0831 16:05:44.276241    5342 round_trippers.go:580]     Audit-Id: 60b6e580-4530-48dd-9df1-665ea775f8d3
	I0831 16:05:44.276246    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:44.276249    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:44.276501    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:44.276738    5342 node_ready.go:53] node "multinode-957000" has status "Ready":"False"
	I0831 16:05:44.773683    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:44.773705    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:44.773716    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:44.773725    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:44.776291    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:44.776307    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:44.776315    5342 round_trippers.go:580]     Audit-Id: 2120b79d-1a9e-44c2-b300-f736aed25210
	I0831 16:05:44.776320    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:44.776326    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:44.776331    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:44.776337    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:44.776342    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:44 GMT
	I0831 16:05:44.776589    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:45.274846    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:45.274869    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:45.274887    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:45.274898    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:45.277487    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:45.277503    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:45.277511    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:45.277515    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:45.277519    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:45 GMT
	I0831 16:05:45.277523    5342 round_trippers.go:580]     Audit-Id: 035b28db-7075-46f2-89e4-46832cb6271d
	I0831 16:05:45.277525    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:45.277528    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:45.277610    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:45.773617    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:45.773712    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:45.773726    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:45.773734    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:45.776523    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:45.776553    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:45.776581    5342 round_trippers.go:580]     Audit-Id: 42a4dc2b-f358-487c-aaef-35924bd58252
	I0831 16:05:45.776587    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:45.776592    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:45.776596    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:45.776600    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:45.776604    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:45 GMT
	I0831 16:05:45.776845    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:46.273321    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:46.273356    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:46.273369    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:46.273375    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:46.276211    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:46.276226    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:46.276233    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:46 GMT
	I0831 16:05:46.276238    5342 round_trippers.go:580]     Audit-Id: cb7de204-b21d-4b06-b350-5818c94e59b6
	I0831 16:05:46.276242    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:46.276246    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:46.276250    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:46.276254    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:46.276601    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"765","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5508 chars]
	I0831 16:05:46.276880    5342 node_ready.go:53] node "multinode-957000" has status "Ready":"False"
	I0831 16:05:46.773242    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:46.773269    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:46.773280    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:46.773287    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:46.776034    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:46.776049    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:46.776056    5342 round_trippers.go:580]     Audit-Id: 7cc81b9c-4c21-4bf4-b0c7-7fc84790d556
	I0831 16:05:46.776061    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:46.776065    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:46.776069    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:46.776073    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:46.776077    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:46 GMT
	I0831 16:05:46.776174    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"866","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5285 chars]
	I0831 16:05:46.776427    5342 node_ready.go:49] node "multinode-957000" has status "Ready":"True"
	I0831 16:05:46.776447    5342 node_ready.go:38] duration metric: took 16.003447569s for node "multinode-957000" to be "Ready" ...
	I0831 16:05:46.776455    5342 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 16:05:46.776498    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods
	I0831 16:05:46.776506    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:46.776514    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:46.776520    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:46.778824    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:46.778835    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:46.778843    5342 round_trippers.go:580]     Audit-Id: 7800b1ec-0ed4-4349-917f-fbb1a11d6eee
	I0831 16:05:46.778850    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:46.778854    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:46.778860    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:46.778866    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:46.778868    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:46 GMT
	I0831 16:05:46.779471    5342 request.go:1351] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"resourceVersion":"866"},"items":[{"metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f
:preferredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{ [truncated 88963 chars]
	I0831 16:05:46.781344    5342 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace to be "Ready" ...
	I0831 16:05:46.781378    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:46.781384    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:46.781389    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:46.781393    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:46.782657    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:46.782669    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:46.782675    5342 round_trippers.go:580]     Audit-Id: ae7b58c0-c312-418f-aa17-d9bfe53f3454
	I0831 16:05:46.782679    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:46.782682    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:46.782685    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:46.782687    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:46.782691    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:46 GMT
	I0831 16:05:46.782867    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:46.783109    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:46.783116    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:46.783121    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:46.783125    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:46.784029    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:05:46.784036    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:46.784042    5342 round_trippers.go:580]     Audit-Id: 252db5de-fa20-4fdf-a923-75bc34c02ab9
	I0831 16:05:46.784046    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:46.784050    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:46.784053    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:46.784056    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:46.784059    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:46 GMT
	I0831 16:05:46.784213    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"866","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5285 chars]
	I0831 16:05:47.281784    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:47.281806    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:47.281822    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:47.281827    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:47.284737    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:47.284753    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:47.284760    5342 round_trippers.go:580]     Audit-Id: 08d2a849-7b46-4867-8e32-54d53c59b407
	I0831 16:05:47.284764    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:47.284767    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:47.284771    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:47.284775    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:47.284779    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:47 GMT
	I0831 16:05:47.284858    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:47.285226    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:47.285235    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:47.285243    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:47.285247    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:47.286776    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:47.286785    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:47.286790    5342 round_trippers.go:580]     Audit-Id: 8a57edc9-f8e3-4ac0-a550-bdefd04325be
	I0831 16:05:47.286794    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:47.286797    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:47.286802    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:47.286806    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:47.286812    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:47 GMT
	I0831 16:05:47.287041    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"866","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5285 chars]
	I0831 16:05:47.782102    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:47.782126    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:47.782147    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:47.782154    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:47.784806    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:47.784827    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:47.784835    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:47.784840    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:47.784844    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:47.784847    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:47.784851    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:47 GMT
	I0831 16:05:47.784857    5342 round_trippers.go:580]     Audit-Id: 84df7ad2-f18d-4664-ae2c-39605fe84600
	I0831 16:05:47.784941    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:47.785313    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:47.785323    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:47.785330    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:47.785336    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:47.786956    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:47.786966    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:47.786971    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:47.786974    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:47.786978    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:47.786980    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:47 GMT
	I0831 16:05:47.786982    5342 round_trippers.go:580]     Audit-Id: 5c51a3a1-dff7-4ade-84b7-b7bf47992a6b
	I0831 16:05:47.786985    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:47.787064    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"866","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5285 chars]
	I0831 16:05:48.282751    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:48.282778    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:48.282790    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:48.282797    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:48.285988    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:48.286003    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:48.286010    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:48.286014    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:48 GMT
	I0831 16:05:48.286018    5342 round_trippers.go:580]     Audit-Id: fd81112f-cf75-4cc2-9586-ab261ab8dcd2
	I0831 16:05:48.286022    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:48.286025    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:48.286030    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:48.286139    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:48.286517    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:48.286527    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:48.286534    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:48.286545    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:48.287963    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:48.287974    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:48.287980    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:48.287994    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:48.288002    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:48.288007    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:48.288012    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:48 GMT
	I0831 16:05:48.288014    5342 round_trippers.go:580]     Audit-Id: ab976624-9007-49dd-8fdf-76c6e171281d
	I0831 16:05:48.288159    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"866","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5285 chars]
	I0831 16:05:48.781588    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:48.781618    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:48.781663    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:48.781672    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:48.784223    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:48.784243    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:48.784253    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:48.784260    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:48.784265    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:48 GMT
	I0831 16:05:48.784275    5342 round_trippers.go:580]     Audit-Id: ac90deb7-7a9e-4e69-85bc-5eb84bedc218
	I0831 16:05:48.784280    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:48.784285    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:48.784576    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:48.784952    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:48.784962    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:48.784970    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:48.784984    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:48.786218    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:48.786227    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:48.786232    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:48.786235    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:48.786238    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:48.786240    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:48.786243    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:48 GMT
	I0831 16:05:48.786247    5342 round_trippers.go:580]     Audit-Id: a2ce0d44-1b9d-41c6-a7b5-0bbdabaa81ff
	I0831 16:05:48.786503    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"866","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5285 chars]
	I0831 16:05:48.786678    5342 pod_ready.go:103] pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace has status "Ready":"False"
	I0831 16:05:49.283052    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:49.283079    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:49.283087    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:49.283092    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:49.286373    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:49.286390    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:49.286399    5342 round_trippers.go:580]     Audit-Id: 62b869c2-3653-41ec-aba2-e7b853d162d6
	I0831 16:05:49.286413    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:49.286420    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:49.286424    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:49.286428    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:49.286431    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:49 GMT
	I0831 16:05:49.286529    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:49.286903    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:49.286912    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:49.286924    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:49.286929    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:49.288963    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:49.288974    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:49.288981    5342 round_trippers.go:580]     Audit-Id: ecfe1eac-e2fe-4c03-aa30-199b889f66e4
	I0831 16:05:49.288987    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:49.288995    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:49.289001    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:49.289006    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:49.289008    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:49 GMT
	I0831 16:05:49.289291    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"866","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5285 chars]
	I0831 16:05:49.783105    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:49.783132    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:49.783145    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:49.783151    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:49.786090    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:49.786107    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:49.786114    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:49.786119    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:49.786123    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:49.786126    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:49.786134    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:49 GMT
	I0831 16:05:49.786138    5342 round_trippers.go:580]     Audit-Id: 53622ff8-314c-45cc-8c4c-73cf7c50c9c8
	I0831 16:05:49.786288    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:49.786675    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:49.786684    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:49.786692    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:49.786714    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:49.788250    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:49.788257    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:49.788261    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:49.788264    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:49 GMT
	I0831 16:05:49.788268    5342 round_trippers.go:580]     Audit-Id: c0f80308-5845-4b66-b94a-c84e51535d1a
	I0831 16:05:49.788271    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:49.788274    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:49.788276    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:49.788376    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:50.282624    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:50.282649    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:50.282662    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:50.282671    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:50.285337    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:50.285352    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:50.285359    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:50.285364    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:50.285368    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:50 GMT
	I0831 16:05:50.285371    5342 round_trippers.go:580]     Audit-Id: e6b913e9-f9d7-45a7-a7a7-615d94227b68
	I0831 16:05:50.285379    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:50.285383    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:50.285461    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:50.285829    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:50.285839    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:50.285846    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:50.285850    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:50.287872    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:50.287881    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:50.287886    5342 round_trippers.go:580]     Audit-Id: 04d5760f-6ba7-4d14-9347-1901883cf521
	I0831 16:05:50.287891    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:50.287893    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:50.287896    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:50.287899    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:50.287914    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:50 GMT
	I0831 16:05:50.288003    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:50.783381    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:50.783404    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:50.783417    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:50.783427    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:50.786048    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:50.786064    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:50.786071    5342 round_trippers.go:580]     Audit-Id: 9c5b8a72-017d-4897-adb5-a4610b12669d
	I0831 16:05:50.786075    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:50.786079    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:50.786082    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:50.786086    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:50.786090    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:50 GMT
	I0831 16:05:50.786221    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:50.786607    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:50.786617    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:50.786624    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:50.786633    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:50.788007    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:50.788015    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:50.788020    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:50.788024    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:50 GMT
	I0831 16:05:50.788031    5342 round_trippers.go:580]     Audit-Id: cbc9f70b-27a4-44b8-8e2a-05587cf452a2
	I0831 16:05:50.788035    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:50.788038    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:50.788042    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:50.788172    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:50.788341    5342 pod_ready.go:103] pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace has status "Ready":"False"
	I0831 16:05:51.281612    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:51.281634    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:51.281646    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:51.281654    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:51.284318    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:51.284332    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:51.284340    5342 round_trippers.go:580]     Audit-Id: b341484b-c434-4bd8-86a1-620eec24d3b7
	I0831 16:05:51.284344    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:51.284349    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:51.284352    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:51.284357    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:51.284360    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:51 GMT
	I0831 16:05:51.284462    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:51.284836    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:51.284847    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:51.284854    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:51.284860    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:51.288199    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:51.288210    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:51.288215    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:51.288219    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:51.288223    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:51.288226    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:51.288228    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:51 GMT
	I0831 16:05:51.288231    5342 round_trippers.go:580]     Audit-Id: 320a4113-b284-48c2-bd2c-e28d120860f8
	I0831 16:05:51.288330    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:51.782609    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:51.782644    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:51.782701    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:51.782709    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:51.785339    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:51.785358    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:51.785366    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:51.785371    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:51 GMT
	I0831 16:05:51.785374    5342 round_trippers.go:580]     Audit-Id: c54baba1-8786-4a67-a23d-923048d8ef95
	I0831 16:05:51.785377    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:51.785380    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:51.785383    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:51.785614    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:51.786010    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:51.786020    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:51.786032    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:51.786037    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:51.787212    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:51.787221    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:51.787227    5342 round_trippers.go:580]     Audit-Id: bfa96913-e0a0-47fa-afee-e75e14a0a3f2
	I0831 16:05:51.787234    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:51.787238    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:51.787243    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:51.787247    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:51.787251    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:51 GMT
	I0831 16:05:51.787415    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:52.281676    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:52.281702    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:52.281714    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:52.281722    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:52.284643    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:52.284658    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:52.284665    5342 round_trippers.go:580]     Audit-Id: efc738ac-cecf-40b2-85d0-2d603cc830c8
	I0831 16:05:52.284671    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:52.284675    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:52.284679    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:52.284684    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:52.284689    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:52 GMT
	I0831 16:05:52.285024    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:52.285391    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:52.285403    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:52.285410    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:52.285415    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:52.289342    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:52.289354    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:52.289361    5342 round_trippers.go:580]     Audit-Id: 42e94a09-88ef-4fee-b730-120a1ece6d82
	I0831 16:05:52.289365    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:52.289369    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:52.289372    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:52.289374    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:52.289377    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:52 GMT
	I0831 16:05:52.289489    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:52.783026    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:52.783054    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:52.783066    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:52.783072    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:52.785697    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:52.785713    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:52.785720    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:52 GMT
	I0831 16:05:52.785725    5342 round_trippers.go:580]     Audit-Id: 1ac5d869-c704-44ea-a346-0d1e27d7ba98
	I0831 16:05:52.785730    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:52.785737    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:52.785741    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:52.785744    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:52.785904    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:52.786284    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:52.786294    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:52.786301    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:52.786308    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:52.787954    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:52.787964    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:52.787969    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:52.787972    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:52.787978    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:52.787992    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:52.787998    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:52 GMT
	I0831 16:05:52.788001    5342 round_trippers.go:580]     Audit-Id: 3fb2c62a-f7de-4ea0-aacc-50b2c0ac6c3c
	I0831 16:05:52.788123    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:53.282403    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:53.282427    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:53.282439    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:53.282444    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:53.285238    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:53.285254    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:53.285261    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:53.285264    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:53.285269    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:53 GMT
	I0831 16:05:53.285273    5342 round_trippers.go:580]     Audit-Id: e0844063-6dd0-45de-b918-102309d8b8b2
	I0831 16:05:53.285276    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:53.285281    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:53.285385    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:53.285762    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:53.285773    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:53.285780    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:53.285786    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:53.289675    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:53.289685    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:53.289690    5342 round_trippers.go:580]     Audit-Id: 8c1e4d80-bf84-4edc-a481-576c363af1ea
	I0831 16:05:53.289694    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:53.289698    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:53.289702    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:53.289704    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:53.289707    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:53 GMT
	I0831 16:05:53.289770    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:53.289951    5342 pod_ready.go:103] pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace has status "Ready":"False"
	I0831 16:05:53.783664    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:53.783685    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:53.783695    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:53.783716    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:53.786279    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:53.786292    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:53.786301    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:53.786307    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:53.786313    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:53.786318    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:53 GMT
	I0831 16:05:53.786322    5342 round_trippers.go:580]     Audit-Id: 2dcf3252-d12c-4e39-bd15-02247c13b773
	I0831 16:05:53.786325    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:53.786406    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:53.786773    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:53.786783    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:53.786790    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:53.786794    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:53.788164    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:53.788176    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:53.788183    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:53.788207    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:53.788216    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:53.788220    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:53 GMT
	I0831 16:05:53.788223    5342 round_trippers.go:580]     Audit-Id: 2d68b9da-1d12-4e9d-8283-1fc5a646e256
	I0831 16:05:53.788226    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:53.788495    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:54.281974    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:54.281999    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:54.282010    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:54.282017    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:54.284803    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:54.284820    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:54.284830    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:54.284835    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:54.284840    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:54 GMT
	I0831 16:05:54.284845    5342 round_trippers.go:580]     Audit-Id: a302323c-0579-4a85-83db-7f60794ac15b
	I0831 16:05:54.284848    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:54.284854    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:54.285054    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:54.285428    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:54.285438    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:54.285446    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:54.285452    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:54.287041    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:54.287050    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:54.287064    5342 round_trippers.go:580]     Audit-Id: 03527403-e5d5-4dd6-ba3e-f1bcc1fbee43
	I0831 16:05:54.287067    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:54.287069    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:54.287072    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:54.287074    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:54.287079    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:54 GMT
	I0831 16:05:54.287163    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:54.783681    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:54.783708    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:54.783720    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:54.783725    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:54.786449    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:54.786464    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:54.786471    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:54.786477    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:54.786483    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:54.786488    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:54 GMT
	I0831 16:05:54.786497    5342 round_trippers.go:580]     Audit-Id: d7021cda-a3be-48c3-b62b-80591833c684
	I0831 16:05:54.786503    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:54.786771    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:54.787152    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:54.787162    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:54.787170    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:54.787175    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:54.788739    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:54.788748    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:54.788753    5342 round_trippers.go:580]     Audit-Id: 27892449-1a2f-40af-8591-36aff2979650
	I0831 16:05:54.788757    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:54.788760    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:54.788762    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:54.788765    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:54.788769    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:54 GMT
	I0831 16:05:54.789328    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:55.282218    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:55.282242    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:55.282254    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:55.282261    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:55.285026    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:55.285041    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:55.285047    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:55.285052    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:55.285058    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:55.285062    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:55 GMT
	I0831 16:05:55.285067    5342 round_trippers.go:580]     Audit-Id: 530ea6ec-56b0-440d-8bd0-1411bfd4a48d
	I0831 16:05:55.285070    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:55.285168    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:55.285550    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:55.285559    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:55.285566    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:55.285571    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:55.286899    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:55.286910    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:55.286917    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:55 GMT
	I0831 16:05:55.286921    5342 round_trippers.go:580]     Audit-Id: d138b2c0-4147-4464-954d-4b9bdf53791c
	I0831 16:05:55.286924    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:55.286928    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:55.286942    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:55.286947    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:55.287088    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:55.782949    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:55.782965    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:55.782970    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:55.782974    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:55.784546    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:55.784555    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:55.784560    5342 round_trippers.go:580]     Audit-Id: 8e071df8-3b41-45f4-aef9-cfc504af169d
	I0831 16:05:55.784563    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:55.784566    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:55.784569    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:55.784572    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:55.784574    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:55 GMT
	I0831 16:05:55.784749    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:55.785030    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:55.785037    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:55.785043    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:55.785049    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:55.786045    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:05:55.786052    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:55.786057    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:55.786060    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:55.786063    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:55 GMT
	I0831 16:05:55.786068    5342 round_trippers.go:580]     Audit-Id: 475643a1-74d3-413a-9c73-ec38534f4101
	I0831 16:05:55.786073    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:55.786079    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:55.786293    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:55.786472    5342 pod_ready.go:103] pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace has status "Ready":"False"
	I0831 16:05:56.282448    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:56.282472    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:56.282483    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:56.282490    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:56.285461    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:56.285479    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:56.285489    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:56.285494    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:56.285498    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:56.285502    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:56 GMT
	I0831 16:05:56.285507    5342 round_trippers.go:580]     Audit-Id: 7388ea67-37a5-49dd-8f4b-abd4d6987c8a
	I0831 16:05:56.285510    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:56.285595    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:56.286011    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:56.286021    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:56.286029    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:56.286033    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:56.289281    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:56.289290    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:56.289294    5342 round_trippers.go:580]     Audit-Id: 50454dc3-8124-4d6b-8c09-9311502ba507
	I0831 16:05:56.289298    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:56.289300    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:56.289302    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:56.289305    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:56.289308    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:56 GMT
	I0831 16:05:56.289386    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:56.782346    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:56.782371    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:56.782383    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:56.782389    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:56.785098    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:56.785114    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:56.785122    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:56.785127    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:56.785131    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:56.785135    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:56.785151    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:56 GMT
	I0831 16:05:56.785157    5342 round_trippers.go:580]     Audit-Id: 2fb3f68c-dc44-4535-b560-169b3527e9a0
	I0831 16:05:56.785340    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:56.785808    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:56.785816    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:56.785840    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:56.785858    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:56.787759    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:56.787769    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:56.787774    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:56 GMT
	I0831 16:05:56.787778    5342 round_trippers.go:580]     Audit-Id: ccdb285e-a97a-4c4d-8746-d60aeaf34acc
	I0831 16:05:56.787798    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:56.787803    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:56.787806    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:56.787809    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:56.788454    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:57.283367    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:57.283395    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:57.283406    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:57.283411    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:57.286262    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:57.286281    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:57.286289    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:57.286293    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:57.286297    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:57 GMT
	I0831 16:05:57.286302    5342 round_trippers.go:580]     Audit-Id: e090a2d0-78c2-4770-b27c-665f9d258d56
	I0831 16:05:57.286305    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:57.286310    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:57.286405    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:57.286789    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:57.286799    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:57.286806    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:57.286811    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:57.288455    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:57.288465    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:57.288470    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:57.288473    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:57.288476    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:57 GMT
	I0831 16:05:57.288479    5342 round_trippers.go:580]     Audit-Id: e7054cda-c35a-41f9-8e1e-a37f1c82d309
	I0831 16:05:57.288482    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:57.288484    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:57.288557    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:57.782803    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:57.782830    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:57.782842    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:57.782848    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:57.785941    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:05:57.785956    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:57.785963    5342 round_trippers.go:580]     Audit-Id: 1449d501-201c-422d-9ed8-4fbfc2cef051
	I0831 16:05:57.785967    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:57.785970    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:57.785974    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:57.785977    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:57.785980    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:57 GMT
	I0831 16:05:57.786107    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:57.786477    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:57.786486    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:57.786494    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:57.786499    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:57.787941    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:57.787948    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:57.787953    5342 round_trippers.go:580]     Audit-Id: 4a8876b5-ca85-49a2-94ed-d7ebacc6e22d
	I0831 16:05:57.787957    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:57.787962    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:57.787966    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:57.787971    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:57.787973    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:57 GMT
	I0831 16:05:57.788105    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:57.788271    5342 pod_ready.go:103] pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace has status "Ready":"False"
	I0831 16:05:58.283474    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:58.283498    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:58.283509    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:58.283516    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:58.286282    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:58.286295    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:58.286302    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:58 GMT
	I0831 16:05:58.286307    5342 round_trippers.go:580]     Audit-Id: a938ec3f-b9b0-46c2-8a6c-2b3e2a7d3f8c
	I0831 16:05:58.286311    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:58.286315    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:58.286321    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:58.286324    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:58.286444    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:58.286815    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:58.286824    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:58.286832    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:58.286837    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:58.288642    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:58.288649    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:58.288653    5342 round_trippers.go:580]     Audit-Id: 45507f63-e1e1-4ebe-943a-1ca1e10e2677
	I0831 16:05:58.288656    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:58.288675    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:58.288683    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:58.288689    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:58.288694    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:58 GMT
	I0831 16:05:58.288940    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:58.781751    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:58.781773    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:58.781785    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:58.781790    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:58.784469    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:58.784488    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:58.784496    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:58 GMT
	I0831 16:05:58.784500    5342 round_trippers.go:580]     Audit-Id: edfe4280-2cc9-4c51-83c0-60f179852df1
	I0831 16:05:58.784504    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:58.784510    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:58.784515    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:58.784520    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:58.784821    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:58.785198    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:58.785207    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:58.785214    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:58.785219    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:58.786521    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:58.786529    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:58.786535    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:58.786539    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:58 GMT
	I0831 16:05:58.786543    5342 round_trippers.go:580]     Audit-Id: cad4537d-5e6a-46fd-a6dc-9aee54963a26
	I0831 16:05:58.786548    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:58.786552    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:58.786556    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:58.786711    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:59.283717    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:59.283741    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:59.283761    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:59.283766    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:59.288379    5342 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0831 16:05:59.288392    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:59.288397    5342 round_trippers.go:580]     Audit-Id: 966505bb-4166-413a-8105-ecd376590c46
	I0831 16:05:59.288400    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:59.288403    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:59.288405    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:59.288406    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:59.288409    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:59 GMT
	I0831 16:05:59.288505    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:59.288805    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:59.288813    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:59.288819    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:59.288837    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:59.290394    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:59.290404    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:59.290409    5342 round_trippers.go:580]     Audit-Id: 5effccf9-887b-481a-a4b4-1124b33653d9
	I0831 16:05:59.290413    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:59.290415    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:59.290418    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:59.290421    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:59.290423    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:59 GMT
	I0831 16:05:59.290716    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:05:59.781856    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:05:59.781868    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:59.781875    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:59.781878    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:59.783494    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:05:59.783506    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:59.783512    5342 round_trippers.go:580]     Audit-Id: 569db220-688e-4c2e-9177-e8b4e2bf8cdc
	I0831 16:05:59.783524    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:59.783528    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:59.783532    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:59.783535    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:59.783538    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:59 GMT
	I0831 16:05:59.783652    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"746","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7092 chars]
	I0831 16:05:59.783974    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:05:59.783981    5342 round_trippers.go:469] Request Headers:
	I0831 16:05:59.783987    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:05:59.784004    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:05:59.786232    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:05:59.786240    5342 round_trippers.go:577] Response Headers:
	I0831 16:05:59.786244    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:05:59 GMT
	I0831 16:05:59.786247    5342 round_trippers.go:580]     Audit-Id: 8df7aaee-ff16-4950-87ee-8d9bd21aabe0
	I0831 16:05:59.786250    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:05:59.786252    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:05:59.786254    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:05:59.786257    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:05:59.786386    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:06:00.281937    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:06:00.281956    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.281968    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.281974    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.284802    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:00.284814    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.284821    5342 round_trippers.go:580]     Audit-Id: 189637bc-3da9-4814-9ea1-28996e3f4ed5
	I0831 16:06:00.284826    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.284831    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.284835    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.284838    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.284843    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.285439    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"892","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7039 chars]
	I0831 16:06:00.285824    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:06:00.285834    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.285842    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.285848    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.287109    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:00.287117    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.287122    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.287126    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.287174    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.287183    5342 round_trippers.go:580]     Audit-Id: 325f808f-e4b8-423e-bd5d-916a2b2bc909
	I0831 16:06:00.287189    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.287194    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.287393    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:06:00.287576    5342 pod_ready.go:93] pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace has status "Ready":"True"
	I0831 16:06:00.287586    5342 pod_ready.go:82] duration metric: took 13.506152988s for pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.287592    5342 pod_ready.go:79] waiting up to 6m0s for pod "etcd-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.287628    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/etcd-multinode-957000
	I0831 16:06:00.287633    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.287638    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.287641    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.291369    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:06:00.291377    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.291381    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.291384    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.291386    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.291389    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.291392    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.291395    5342 round_trippers.go:580]     Audit-Id: 4e505400-179f-4072-ac33-b47a484c89ae
	I0831 16:06:00.291554    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"etcd-multinode-957000","namespace":"kube-system","uid":"b4833809-a14f-49f4-b877-9f7e4be0bd39","resourceVersion":"857","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"etcd","tier":"control-plane"},"annotations":{"kubeadm.kubernetes.io/etcd.advertise-client-urls":"https://192.169.0.13:2379","kubernetes.io/config.hash":"7ee006dc216d695a2fa4355a2abea57a","kubernetes.io/config.mirror":"7ee006dc216d695a2fa4355a2abea57a","kubernetes.io/config.seen":"2024-08-31T22:57:31.349647295Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubeadm.kubernetes.io/etcd.advertise-cl
ient-urls":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config. [truncated 6663 chars]
	I0831 16:06:00.291808    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:06:00.291815    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.291821    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.291825    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.292971    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:00.292978    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.292983    5342 round_trippers.go:580]     Audit-Id: ac70b130-597d-464b-9000-19db6a35ec34
	I0831 16:06:00.292986    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.292989    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.292992    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.292994    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.293010    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.293130    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:06:00.293287    5342 pod_ready.go:93] pod "etcd-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:06:00.293294    5342 pod_ready.go:82] duration metric: took 5.697616ms for pod "etcd-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.293304    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.293330    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-multinode-957000
	I0831 16:06:00.293335    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.293340    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.293345    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.294474    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:00.294483    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.294491    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.294497    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.294502    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.294505    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.294510    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.294514    5342 round_trippers.go:580]     Audit-Id: 3bb6ffe6-306b-4f43-ba45-efbc0de8ce0c
	I0831 16:06:00.294674    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-apiserver-multinode-957000","namespace":"kube-system","uid":"e549c883-0eb6-43a1-be40-c8d2f3a9468e","resourceVersion":"862","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-apiserver","tier":"control-plane"},"annotations":{"kubeadm.kubernetes.io/kube-apiserver.advertise-address.endpoint":"192.169.0.13:8443","kubernetes.io/config.hash":"5db461e18c39888a5ab16fd535bfcb2e","kubernetes.io/config.mirror":"5db461e18c39888a5ab16fd535bfcb2e","kubernetes.io/config.seen":"2024-08-31T22:57:31.349647948Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubeadm.kube
rnetes.io/kube-apiserver.advertise-address.endpoint":{},"f:kubernetes.i [truncated 7891 chars]
	I0831 16:06:00.294913    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:06:00.294920    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.294930    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.294935    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.296033    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:00.296041    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.296046    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.296077    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.296083    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.296087    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.296094    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.296099    5342 round_trippers.go:580]     Audit-Id: 0fa1d436-caf1-4bab-8003-77a56b7a5215
	I0831 16:06:00.296221    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:06:00.296385    5342 pod_ready.go:93] pod "kube-apiserver-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:06:00.296393    5342 pod_ready.go:82] duration metric: took 3.083412ms for pod "kube-apiserver-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.296399    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.296433    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-multinode-957000
	I0831 16:06:00.296438    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.296443    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.296446    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.297546    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:00.297553    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.297558    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.297571    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.297575    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.297580    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.297584    5342 round_trippers.go:580]     Audit-Id: 08383e94-7df5-489b-9482-391a10d7a3d3
	I0831 16:06:00.297588    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.297716    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-controller-manager-multinode-957000","namespace":"kube-system","uid":"8a82b721-75a3-4460-b9eb-bfc4db35f20e","resourceVersion":"859","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-controller-manager","tier":"control-plane"},"annotations":{"kubernetes.io/config.hash":"9edb08d8378ca77b90e86ed290d828c5","kubernetes.io/config.mirror":"9edb08d8378ca77b90e86ed290d828c5","kubernetes.io/config.seen":"2024-08-31T22:57:31.349643093Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config.mirror":{},"f:kubernetes.i
o/config.seen":{},"f:kubernetes.io/config.source":{}},"f:labels":{".":{ [truncated 7464 chars]
	I0831 16:06:00.297954    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:06:00.297961    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.297965    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.297971    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.299037    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:00.299045    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.299050    5342 round_trippers.go:580]     Audit-Id: fe100f65-cc30-4471-8baa-7017d3154fdc
	I0831 16:06:00.299053    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.299056    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.299059    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.299063    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.299066    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.299198    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:06:00.299369    5342 pod_ready.go:93] pod "kube-controller-manager-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:06:00.299377    5342 pod_ready.go:82] duration metric: took 2.972185ms for pod "kube-controller-manager-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.299395    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-cplv4" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.299430    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cplv4
	I0831 16:06:00.299435    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.299440    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.299444    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.300933    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:00.300942    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.300947    5342 round_trippers.go:580]     Audit-Id: 6c3e9266-6da1-44fd-956c-3d86dae3c4a2
	I0831 16:06:00.300951    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.300968    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.300973    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.300976    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.300979    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.301103    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-cplv4","generateName":"kube-proxy-","namespace":"kube-system","uid":"56ad32e2-f2ba-4fa5-b093-790a5205b4f2","resourceVersion":"476","creationTimestamp":"2024-08-31T22:58:18Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:58:18Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:r
equiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k: [truncated 6197 chars]
	I0831 16:06:00.301337    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:00.301344    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.301349    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.301354    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.302582    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:00.302592    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.302598    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.302602    5342 round_trippers.go:580]     Audit-Id: 45440044-1028-4d76-a30a-6fd9aac0c5ba
	I0831 16:06:00.302606    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.302609    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.302611    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.302614    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.302706    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"80356a3f-91f2-42b6-b267-2e41c24b1477","resourceVersion":"542","creationTimestamp":"2024-08-31T22:58:18Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T15_58_18_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubeadm","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:58:18Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{"f:kubeadm.alpha.kubernetes.io/cri-socket":{}}}}},{"man [truncated 3818 chars]
	I0831 16:06:00.302869    5342 pod_ready.go:93] pod "kube-proxy-cplv4" in "kube-system" namespace has status "Ready":"True"
	I0831 16:06:00.302879    5342 pod_ready.go:82] duration metric: took 3.478614ms for pod "kube-proxy-cplv4" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.302898    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-ndfs6" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.482652    5342 request.go:632] Waited for 179.688038ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-ndfs6
	I0831 16:06:00.482740    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-ndfs6
	I0831 16:06:00.482748    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.482760    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.482771    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.485631    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:00.485648    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.485655    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.485659    5342 round_trippers.go:580]     Audit-Id: 1fb9d308-3c2d-430c-a79b-26a3a8014f98
	I0831 16:06:00.485663    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.485667    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.485707    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.485722    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.485963    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-ndfs6","generateName":"kube-proxy-","namespace":"kube-system","uid":"34c16419-4c10-41bd-9446-75ba130cbe63","resourceVersion":"707","creationTimestamp":"2024-08-31T22:59:10Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:59:10Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:r
equiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k: [truncated 6197 chars]
	I0831 16:06:00.683376    5342 request.go:632] Waited for 197.051988ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:06:00.683592    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:06:00.683602    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.683613    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.683630    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.686256    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:00.686287    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.686295    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.686300    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.686304    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:00 GMT
	I0831 16:06:00.686307    5342 round_trippers.go:580]     Audit-Id: 8a8eb141-be1a-459f-af86-a9a50f6cf9ab
	I0831 16:06:00.686315    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.686320    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.686408    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"0867ece2-944d-429d-b3c6-0eab243276ee","resourceVersion":"732","creationTimestamp":"2024-08-31T23:00:04Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_00_04_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubeadm","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:00:04Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{"f:kubeadm.alpha.kubernetes.io/cri-socket":{}}}}},{"man [truncated 3635 chars]
	I0831 16:06:00.686637    5342 pod_ready.go:93] pod "kube-proxy-ndfs6" in "kube-system" namespace has status "Ready":"True"
	I0831 16:06:00.686648    5342 pod_ready.go:82] duration metric: took 383.740352ms for pod "kube-proxy-ndfs6" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.686656    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-zf7j6" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:00.883351    5342 request.go:632] Waited for 196.648667ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zf7j6
	I0831 16:06:00.883428    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zf7j6
	I0831 16:06:00.883439    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:00.883451    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:00.883457    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:00.886058    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:00.886079    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:00.886090    5342 round_trippers.go:580]     Audit-Id: 51444c89-c00d-480d-89c7-cdb04bd83029
	I0831 16:06:00.886095    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:00.886099    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:00.886104    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:00.886108    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:00.886112    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:01 GMT
	I0831 16:06:00.886246    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-zf7j6","generateName":"kube-proxy-","namespace":"kube-system","uid":"e84c5d55-f27d-4d2a-9b41-6f1e6100ad2e","resourceVersion":"756","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:r
equiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k: [truncated 6394 chars]
	I0831 16:06:01.082286    5342 request.go:632] Waited for 195.655028ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:06:01.082363    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:06:01.082371    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:01.082446    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:01.082456    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:01.085334    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:01.085351    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:01.085358    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:01.085371    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:01.085376    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:01 GMT
	I0831 16:06:01.085379    5342 round_trippers.go:580]     Audit-Id: b418e7da-d979-45f4-bba8-4ca4f78106c3
	I0831 16:06:01.085383    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:01.085386    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:01.085518    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:06:01.085779    5342 pod_ready.go:93] pod "kube-proxy-zf7j6" in "kube-system" namespace has status "Ready":"True"
	I0831 16:06:01.085791    5342 pod_ready.go:82] duration metric: took 399.127501ms for pod "kube-proxy-zf7j6" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:01.085800    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:01.284050    5342 request.go:632] Waited for 198.181809ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-multinode-957000
	I0831 16:06:01.284206    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-multinode-957000
	I0831 16:06:01.284217    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:01.284229    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:01.284237    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:01.286962    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:01.286981    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:01.286989    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:01.286994    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:01.286998    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:01.287003    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:01.287008    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:01 GMT
	I0831 16:06:01.287013    5342 round_trippers.go:580]     Audit-Id: ecc26166-986e-468b-9c92-5324a27864b2
	I0831 16:06:01.287130    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-scheduler-multinode-957000","namespace":"kube-system","uid":"f48d9647-8460-48da-a5b0-fc471f5536ad","resourceVersion":"847","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-scheduler","tier":"control-plane"},"annotations":{"kubernetes.io/config.hash":"b74e8393ad84ccbcf23f7560eda422b0","kubernetes.io/config.mirror":"b74e8393ad84ccbcf23f7560eda422b0","kubernetes.io/config.seen":"2024-08-31T22:57:31.349646560Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config.mirror":{},"f:kubernetes.io/config.seen":{},
"f:kubernetes.io/config.source":{}},"f:labels":{".":{},"f:component":{} [truncated 5194 chars]
	I0831 16:06:01.482101    5342 request.go:632] Waited for 194.66001ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:06:01.482215    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:06:01.482227    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:01.482238    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:01.482246    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:01.485005    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:01.485021    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:01.485028    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:01.485032    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:01.485037    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:01 GMT
	I0831 16:06:01.485042    5342 round_trippers.go:580]     Audit-Id: 34291b73-4ac4-44ea-a8d8-14b6ec4eaf0a
	I0831 16:06:01.485046    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:01.485050    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:01.485144    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:06:01.485414    5342 pod_ready.go:93] pod "kube-scheduler-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:06:01.485426    5342 pod_ready.go:82] duration metric: took 399.617523ms for pod "kube-scheduler-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:06:01.485440    5342 pod_ready.go:39] duration metric: took 14.708891053s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 16:06:01.485456    5342 api_server.go:52] waiting for apiserver process to appear ...
	I0831 16:06:01.485518    5342 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 16:06:01.499449    5342 command_runner.go:130] > 1696
	I0831 16:06:01.499544    5342 api_server.go:72] duration metric: took 30.970924323s to wait for apiserver process to appear ...
	I0831 16:06:01.499555    5342 api_server.go:88] waiting for apiserver healthz status ...
	I0831 16:06:01.499564    5342 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 16:06:01.502739    5342 api_server.go:279] https://192.169.0.13:8443/healthz returned 200:
	ok
	I0831 16:06:01.502768    5342 round_trippers.go:463] GET https://192.169.0.13:8443/version
	I0831 16:06:01.502772    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:01.502778    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:01.502782    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:01.503391    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:06:01.503404    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:01.503409    5342 round_trippers.go:580]     Audit-Id: 4588ce78-b8d1-4e73-a318-cb75ab6a808a
	I0831 16:06:01.503412    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:01.503416    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:01.503419    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:01.503434    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:01.503437    5342 round_trippers.go:580]     Content-Length: 263
	I0831 16:06:01.503440    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:01 GMT
	I0831 16:06:01.503448    5342 request.go:1351] Response Body: {
	  "major": "1",
	  "minor": "31",
	  "gitVersion": "v1.31.0",
	  "gitCommit": "9edcffcde5595e8a5b1a35f88c421764e575afce",
	  "gitTreeState": "clean",
	  "buildDate": "2024-08-13T07:28:49Z",
	  "goVersion": "go1.22.5",
	  "compiler": "gc",
	  "platform": "linux/amd64"
	}
	I0831 16:06:01.503475    5342 api_server.go:141] control plane version: v1.31.0
	I0831 16:06:01.503484    5342 api_server.go:131] duration metric: took 3.925183ms to wait for apiserver health ...
	I0831 16:06:01.503489    5342 system_pods.go:43] waiting for kube-system pods to appear ...
	I0831 16:06:01.682312    5342 request.go:632] Waited for 178.779396ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods
	I0831 16:06:01.682394    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods
	I0831 16:06:01.682405    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:01.682416    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:01.682423    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:01.686325    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:06:01.686342    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:01.686350    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:01.686355    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:01.686359    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:01.686362    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:01.686366    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:01 GMT
	I0831 16:06:01.686370    5342 round_trippers.go:580]     Audit-Id: 685fd12e-14b9-4110-bb72-7df0ed889a85
	I0831 16:06:01.687431    5342 request.go:1351] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"resourceVersion":"897"},"items":[{"metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"892","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{ [truncated 89323 chars]
	I0831 16:06:01.689410    5342 system_pods.go:59] 12 kube-system pods found
	I0831 16:06:01.689434    5342 system_pods.go:61] "coredns-6f6b679f8f-q4s6r" [b794efa0-8367-452b-90be-870e8d349f6f] Running
	I0831 16:06:01.689438    5342 system_pods.go:61] "etcd-multinode-957000" [b4833809-a14f-49f4-b877-9f7e4be0bd39] Running
	I0831 16:06:01.689441    5342 system_pods.go:61] "kindnet-5vc9x" [a8f9df46-0974-4620-a7c1-6022793f34f1] Running
	I0831 16:06:01.689444    5342 system_pods.go:61] "kindnet-cjqw5" [4a7f98b7-3e6d-4e84-b4ee-6838db3d880b] Running
	I0831 16:06:01.689446    5342 system_pods.go:61] "kindnet-gkhfh" [8c3c358a-7566-4871-a514-82c6190fab18] Running
	I0831 16:06:01.689449    5342 system_pods.go:61] "kube-apiserver-multinode-957000" [e549c883-0eb6-43a1-be40-c8d2f3a9468e] Running
	I0831 16:06:01.689452    5342 system_pods.go:61] "kube-controller-manager-multinode-957000" [8a82b721-75a3-4460-b9eb-bfc4db35f20e] Running
	I0831 16:06:01.689458    5342 system_pods.go:61] "kube-proxy-cplv4" [56ad32e2-f2ba-4fa5-b093-790a5205b4f2] Running
	I0831 16:06:01.689461    5342 system_pods.go:61] "kube-proxy-ndfs6" [34c16419-4c10-41bd-9446-75ba130cbe63] Running
	I0831 16:06:01.689477    5342 system_pods.go:61] "kube-proxy-zf7j6" [e84c5d55-f27d-4d2a-9b41-6f1e6100ad2e] Running
	I0831 16:06:01.689506    5342 system_pods.go:61] "kube-scheduler-multinode-957000" [f48d9647-8460-48da-a5b0-fc471f5536ad] Running
	I0831 16:06:01.689513    5342 system_pods.go:61] "storage-provisioner" [f389bc9a-20cc-4e07-bc7f-f418f53773c9] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0831 16:06:01.689525    5342 system_pods.go:74] duration metric: took 186.031388ms to wait for pod list to return data ...
	I0831 16:06:01.689536    5342 default_sa.go:34] waiting for default service account to be created ...
	I0831 16:06:01.882104    5342 request.go:632] Waited for 192.523513ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/default/serviceaccounts
	I0831 16:06:01.882167    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/default/serviceaccounts
	I0831 16:06:01.882171    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:01.882177    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:01.882180    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:01.884690    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:01.884700    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:01.884706    5342 round_trippers.go:580]     Content-Length: 261
	I0831 16:06:01.884709    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:02 GMT
	I0831 16:06:01.884712    5342 round_trippers.go:580]     Audit-Id: 0c9cad3a-2914-4ee6-8d60-6a746948560d
	I0831 16:06:01.884715    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:01.884717    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:01.884719    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:01.884722    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:01.884732    5342 request.go:1351] Response Body: {"kind":"ServiceAccountList","apiVersion":"v1","metadata":{"resourceVersion":"897"},"items":[{"metadata":{"name":"default","namespace":"default","uid":"232439b8-1714-4cf0-98cd-73337a287803","resourceVersion":"322","creationTimestamp":"2024-08-31T22:57:36Z"}}]}
	I0831 16:06:01.884857    5342 default_sa.go:45] found service account: "default"
	I0831 16:06:01.884866    5342 default_sa.go:55] duration metric: took 195.323773ms for default service account to be created ...
	I0831 16:06:01.884872    5342 system_pods.go:116] waiting for k8s-apps to be running ...
	I0831 16:06:02.082249    5342 request.go:632] Waited for 197.326454ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods
	I0831 16:06:02.082303    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods
	I0831 16:06:02.082345    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:02.082359    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:02.082367    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:02.085365    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:02.085378    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:02.085389    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:02 GMT
	I0831 16:06:02.085395    5342 round_trippers.go:580]     Audit-Id: 4c6c242f-f830-4ea2-86c5-72e92c82183a
	I0831 16:06:02.085403    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:02.085410    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:02.085415    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:02.085438    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:02.086430    5342 request.go:1351] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"resourceVersion":"897"},"items":[{"metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"892","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{ [truncated 89323 chars]
	I0831 16:06:02.088429    5342 system_pods.go:86] 12 kube-system pods found
	I0831 16:06:02.088442    5342 system_pods.go:89] "coredns-6f6b679f8f-q4s6r" [b794efa0-8367-452b-90be-870e8d349f6f] Running
	I0831 16:06:02.088446    5342 system_pods.go:89] "etcd-multinode-957000" [b4833809-a14f-49f4-b877-9f7e4be0bd39] Running
	I0831 16:06:02.088449    5342 system_pods.go:89] "kindnet-5vc9x" [a8f9df46-0974-4620-a7c1-6022793f34f1] Running
	I0831 16:06:02.088452    5342 system_pods.go:89] "kindnet-cjqw5" [4a7f98b7-3e6d-4e84-b4ee-6838db3d880b] Running
	I0831 16:06:02.088454    5342 system_pods.go:89] "kindnet-gkhfh" [8c3c358a-7566-4871-a514-82c6190fab18] Running
	I0831 16:06:02.088457    5342 system_pods.go:89] "kube-apiserver-multinode-957000" [e549c883-0eb6-43a1-be40-c8d2f3a9468e] Running
	I0831 16:06:02.088459    5342 system_pods.go:89] "kube-controller-manager-multinode-957000" [8a82b721-75a3-4460-b9eb-bfc4db35f20e] Running
	I0831 16:06:02.088463    5342 system_pods.go:89] "kube-proxy-cplv4" [56ad32e2-f2ba-4fa5-b093-790a5205b4f2] Running
	I0831 16:06:02.088466    5342 system_pods.go:89] "kube-proxy-ndfs6" [34c16419-4c10-41bd-9446-75ba130cbe63] Running
	I0831 16:06:02.088468    5342 system_pods.go:89] "kube-proxy-zf7j6" [e84c5d55-f27d-4d2a-9b41-6f1e6100ad2e] Running
	I0831 16:06:02.088471    5342 system_pods.go:89] "kube-scheduler-multinode-957000" [f48d9647-8460-48da-a5b0-fc471f5536ad] Running
	I0831 16:06:02.088475    5342 system_pods.go:89] "storage-provisioner" [f389bc9a-20cc-4e07-bc7f-f418f53773c9] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0831 16:06:02.088480    5342 system_pods.go:126] duration metric: took 203.602987ms to wait for k8s-apps to be running ...
	I0831 16:06:02.088484    5342 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 16:06:02.088532    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 16:06:02.100367    5342 system_svc.go:56] duration metric: took 11.877838ms WaitForService to wait for kubelet
	I0831 16:06:02.100382    5342 kubeadm.go:582] duration metric: took 31.571757872s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 16:06:02.100394    5342 node_conditions.go:102] verifying NodePressure condition ...
	I0831 16:06:02.283022    5342 request.go:632] Waited for 182.571365ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes
	I0831 16:06:02.283146    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes
	I0831 16:06:02.283157    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:02.283168    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:02.283177    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:02.286202    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:02.286220    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:02.286230    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:02.286237    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:02.286242    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:02 GMT
	I0831 16:06:02.286246    5342 round_trippers.go:580]     Audit-Id: bd9dae7d-c386-42ec-ab69-0dc893c45f09
	I0831 16:06:02.286251    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:02.286260    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:02.286501    5342 request.go:1351] Response Body: {"kind":"NodeList","apiVersion":"v1","metadata":{"resourceVersion":"897"},"items":[{"metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time" [truncated 14655 chars]
	I0831 16:06:02.287093    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:06:02.287105    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:06:02.287113    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:06:02.287117    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:06:02.287129    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:06:02.287132    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:06:02.287136    5342 node_conditions.go:105] duration metric: took 186.738055ms to run NodePressure ...
	I0831 16:06:02.287148    5342 start.go:241] waiting for startup goroutines ...
	I0831 16:06:02.287156    5342 start.go:246] waiting for cluster config update ...
	I0831 16:06:02.287177    5342 start.go:255] writing updated cluster config ...
	I0831 16:06:02.309156    5342 out.go:201] 
	I0831 16:06:02.331310    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:06:02.331435    5342 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/config.json ...
	I0831 16:06:02.353710    5342 out.go:177] * Starting "multinode-957000-m02" worker node in "multinode-957000" cluster
	I0831 16:06:02.396729    5342 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 16:06:02.396764    5342 cache.go:56] Caching tarball of preloaded images
	I0831 16:06:02.396954    5342 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 16:06:02.396973    5342 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 16:06:02.397123    5342 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/config.json ...
	I0831 16:06:02.398171    5342 start.go:360] acquireMachinesLock for multinode-957000-m02: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:06:02.398292    5342 start.go:364] duration metric: took 95.733µs to acquireMachinesLock for "multinode-957000-m02"
	I0831 16:06:02.398319    5342 start.go:96] Skipping create...Using existing machine configuration
	I0831 16:06:02.398326    5342 fix.go:54] fixHost starting: m02
	I0831 16:06:02.398778    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:06:02.398796    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:06:02.408006    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53167
	I0831 16:06:02.408462    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:06:02.408810    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:06:02.408827    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:06:02.409101    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:06:02.409218    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 16:06:02.409324    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetState
	I0831 16:06:02.409409    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:06:02.409506    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | hyperkit pid from json: 4597
	I0831 16:06:02.410424    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | hyperkit pid 4597 missing from process table
	I0831 16:06:02.410453    5342 fix.go:112] recreateIfNeeded on multinode-957000-m02: state=Stopped err=<nil>
	I0831 16:06:02.410464    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	W0831 16:06:02.410549    5342 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 16:06:02.431603    5342 out.go:177] * Restarting existing hyperkit VM for "multinode-957000-m02" ...
	I0831 16:06:02.473670    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .Start
	I0831 16:06:02.473941    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:06:02.474023    5342 main.go:141] libmachine: (multinode-957000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/hyperkit.pid
	I0831 16:06:02.475780    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | hyperkit pid 4597 missing from process table
	I0831 16:06:02.475798    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | pid 4597 is in state "Stopped"
	I0831 16:06:02.475816    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/hyperkit.pid...
	I0831 16:06:02.476157    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | Using UUID 26b6a3a9-109f-42df-92e6-b799b636693f
	I0831 16:06:02.502482    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | Generated MAC 6:27:eb:c0:a3:31
	I0831 16:06:02.502503    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000
	I0831 16:06:02.502635    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"26b6a3a9-109f-42df-92e6-b799b636693f", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000439ad0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:06:02.502663    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"26b6a3a9-109f-42df-92e6-b799b636693f", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000439ad0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:06:02.502744    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "26b6a3a9-109f-42df-92e6-b799b636693f", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/multinode-957000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000"}
	I0831 16:06:02.502787    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 26b6a3a9-109f-42df-92e6-b799b636693f -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/multinode-957000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000"
	I0831 16:06:02.502799    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:06:02.504306    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 DEBUG: hyperkit: Pid is 5398
	I0831 16:06:02.504751    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | Attempt 0
	I0831 16:06:02.504767    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:06:02.504831    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | hyperkit pid from json: 5398
	I0831 16:06:02.506467    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | Searching for 6:27:eb:c0:a3:31 in /var/db/dhcpd_leases ...
	I0831 16:06:02.506529    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | Found 14 entries in /var/db/dhcpd_leases!
	I0831 16:06:02.506567    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:06:02.506607    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a08a}
	I0831 16:06:02.506633    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f17f}
	I0831 16:06:02.506632    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetConfigRaw
	I0831 16:06:02.506646    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | Found match: 6:27:eb:c0:a3:31
	I0831 16:06:02.506658    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | IP: 192.169.0.14
	I0831 16:06:02.507296    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetIP
	I0831 16:06:02.507475    5342 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/config.json ...
	I0831 16:06:02.507998    5342 machine.go:93] provisionDockerMachine start ...
	I0831 16:06:02.508009    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 16:06:02.508149    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:02.508261    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:02.508350    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:02.508448    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:02.508534    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:02.508657    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:06:02.508818    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.14 22 <nil> <nil>}
	I0831 16:06:02.508826    5342 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 16:06:02.512086    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:06:02.521227    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:06:02.522418    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:06:02.522442    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:06:02.522469    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:06:02.522485    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:06:02.906677    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:06:02.906694    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:02 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:06:03.021280    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:06:03.021301    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:06:03.021326    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:06:03.021344    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:06:03.022164    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:06:03.022175    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:06:08.638733    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:08 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 16:06:08.638819    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:08 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 16:06:08.638830    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:08 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 16:06:08.662719    5342 main.go:141] libmachine: (multinode-957000-m02) DBG | 2024/08/31 16:06:08 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 16:06:37.571208    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 16:06:37.571222    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetMachineName
	I0831 16:06:37.571357    5342 buildroot.go:166] provisioning hostname "multinode-957000-m02"
	I0831 16:06:37.571365    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetMachineName
	I0831 16:06:37.571465    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:37.571554    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:37.571656    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:37.571742    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:37.571828    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:37.571958    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:06:37.572097    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.14 22 <nil> <nil>}
	I0831 16:06:37.572106    5342 main.go:141] libmachine: About to run SSH command:
	sudo hostname multinode-957000-m02 && echo "multinode-957000-m02" | sudo tee /etc/hostname
	I0831 16:06:37.633851    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: multinode-957000-m02
	
	I0831 16:06:37.633874    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:37.634013    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:37.634103    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:37.634197    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:37.634301    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:37.634448    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:06:37.634607    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.14 22 <nil> <nil>}
	I0831 16:06:37.634621    5342 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\smultinode-957000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 multinode-957000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 multinode-957000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 16:06:37.691721    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 16:06:37.691735    5342 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 16:06:37.691747    5342 buildroot.go:174] setting up certificates
	I0831 16:06:37.691754    5342 provision.go:84] configureAuth start
	I0831 16:06:37.691760    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetMachineName
	I0831 16:06:37.691905    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetIP
	I0831 16:06:37.691989    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:37.692081    5342 provision.go:143] copyHostCerts
	I0831 16:06:37.692110    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 16:06:37.692158    5342 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 16:06:37.692164    5342 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 16:06:37.692346    5342 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 16:06:37.693095    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 16:06:37.693182    5342 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 16:06:37.693188    5342 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 16:06:37.693320    5342 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 16:06:37.693560    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 16:06:37.693607    5342 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 16:06:37.693617    5342 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 16:06:37.693876    5342 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 16:06:37.694077    5342 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.multinode-957000-m02 san=[127.0.0.1 192.169.0.14 localhost minikube multinode-957000-m02]
	I0831 16:06:37.838720    5342 provision.go:177] copyRemoteCerts
	I0831 16:06:37.838769    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 16:06:37.838802    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:37.838945    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:37.839040    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:37.839145    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:37.839232    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.14 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/id_rsa Username:docker}
	I0831 16:06:37.870935    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 16:06:37.871014    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 16:06:37.889991    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 16:06:37.890056    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1229 bytes)
	I0831 16:06:37.908975    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 16:06:37.909042    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 16:06:37.928262    5342 provision.go:87] duration metric: took 236.496045ms to configureAuth
	I0831 16:06:37.928279    5342 buildroot.go:189] setting minikube options for container-runtime
	I0831 16:06:37.928444    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:06:37.928470    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 16:06:37.928602    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:37.928697    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:37.928797    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:37.928885    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:37.928957    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:37.929067    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:06:37.929191    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.14 22 <nil> <nil>}
	I0831 16:06:37.929198    5342 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 16:06:37.979017    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 16:06:37.979030    5342 buildroot.go:70] root file system type: tmpfs
	I0831 16:06:37.979099    5342 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 16:06:37.979110    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:37.979240    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:37.979336    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:37.979430    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:37.979517    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:37.979648    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:06:37.979787    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.14 22 <nil> <nil>}
	I0831 16:06:37.979834    5342 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.13"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 16:06:38.040885    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.13
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 16:06:38.040902    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:38.041035    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:38.041123    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:38.041218    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:38.041308    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:38.041438    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:06:38.041573    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.14 22 <nil> <nil>}
	I0831 16:06:38.041585    5342 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 16:06:39.625056    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 16:06:39.625071    5342 machine.go:96] duration metric: took 37.116846966s to provisionDockerMachine
	I0831 16:06:39.625079    5342 start.go:293] postStartSetup for "multinode-957000-m02" (driver="hyperkit")
	I0831 16:06:39.625097    5342 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 16:06:39.625116    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 16:06:39.625304    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 16:06:39.625318    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:39.625411    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:39.625500    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:39.625579    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:39.625674    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.14 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/id_rsa Username:docker}
	I0831 16:06:39.657307    5342 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 16:06:39.660355    5342 command_runner.go:130] > NAME=Buildroot
	I0831 16:06:39.660366    5342 command_runner.go:130] > VERSION=2023.02.9-dirty
	I0831 16:06:39.660371    5342 command_runner.go:130] > ID=buildroot
	I0831 16:06:39.660378    5342 command_runner.go:130] > VERSION_ID=2023.02.9
	I0831 16:06:39.660384    5342 command_runner.go:130] > PRETTY_NAME="Buildroot 2023.02.9"
	I0831 16:06:39.660425    5342 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 16:06:39.660438    5342 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 16:06:39.660522    5342 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 16:06:39.660658    5342 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 16:06:39.660664    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 16:06:39.660819    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 16:06:39.668111    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 16:06:39.688004    5342 start.go:296] duration metric: took 62.908509ms for postStartSetup
	I0831 16:06:39.688023    5342 fix.go:56] duration metric: took 37.289479116s for fixHost
	I0831 16:06:39.688053    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:39.688179    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:39.688261    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:39.688362    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:39.688441    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:39.688547    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:06:39.688684    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.14 22 <nil> <nil>}
	I0831 16:06:39.688692    5342 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 16:06:39.739575    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725145599.854769597
	
	I0831 16:06:39.739584    5342 fix.go:216] guest clock: 1725145599.854769597
	I0831 16:06:39.739590    5342 fix.go:229] Guest: 2024-08-31 16:06:39.854769597 -0700 PDT Remote: 2024-08-31 16:06:39.688029 -0700 PDT m=+119.889236541 (delta=166.740597ms)
	I0831 16:06:39.739599    5342 fix.go:200] guest clock delta is within tolerance: 166.740597ms
	I0831 16:06:39.739606    5342 start.go:83] releasing machines lock for "multinode-957000-m02", held for 37.341082264s
	I0831 16:06:39.739622    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 16:06:39.739746    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetIP
	I0831 16:06:39.762074    5342 out.go:177] * Found network options:
	I0831 16:06:39.781956    5342 out.go:177]   - NO_PROXY=192.169.0.13
	W0831 16:06:39.803114    5342 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 16:06:39.803153    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 16:06:39.803977    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 16:06:39.804268    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 16:06:39.804405    5342 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 16:06:39.804458    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	W0831 16:06:39.804536    5342 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 16:06:39.804652    5342 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 16:06:39.804672    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:39.804697    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:39.804895    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:39.804927    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:39.805137    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:39.805175    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:39.805357    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:39.805402    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.14 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/id_rsa Username:docker}
	I0831 16:06:39.805558    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.14 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/id_rsa Username:docker}
	I0831 16:06:39.834849    5342 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W0831 16:06:39.835026    5342 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 16:06:39.835091    5342 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 16:06:39.879413    5342 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I0831 16:06:39.879577    5342 command_runner.go:139] > /etc/cni/net.d/87-podman-bridge.conflist, 
	I0831 16:06:39.879615    5342 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 16:06:39.879629    5342 start.go:495] detecting cgroup driver to use...
	I0831 16:06:39.879726    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 16:06:39.895464    5342 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I0831 16:06:39.895538    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 16:06:39.904443    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 16:06:39.913235    5342 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 16:06:39.913284    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 16:06:39.922082    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 16:06:39.930963    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 16:06:39.939818    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 16:06:39.948563    5342 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 16:06:39.957750    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 16:06:39.966675    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 16:06:39.975617    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 16:06:39.984508    5342 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 16:06:39.992478    5342 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I0831 16:06:39.992587    5342 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 16:06:40.000688    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:06:40.095314    5342 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 16:06:40.114644    5342 start.go:495] detecting cgroup driver to use...
	I0831 16:06:40.114722    5342 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 16:06:40.128324    5342 command_runner.go:130] > # /usr/lib/systemd/system/docker.service
	I0831 16:06:40.128778    5342 command_runner.go:130] > [Unit]
	I0831 16:06:40.128786    5342 command_runner.go:130] > Description=Docker Application Container Engine
	I0831 16:06:40.128795    5342 command_runner.go:130] > Documentation=https://docs.docker.com
	I0831 16:06:40.128801    5342 command_runner.go:130] > After=network.target  minikube-automount.service docker.socket
	I0831 16:06:40.128806    5342 command_runner.go:130] > Requires= minikube-automount.service docker.socket 
	I0831 16:06:40.128813    5342 command_runner.go:130] > StartLimitBurst=3
	I0831 16:06:40.128819    5342 command_runner.go:130] > StartLimitIntervalSec=60
	I0831 16:06:40.128825    5342 command_runner.go:130] > [Service]
	I0831 16:06:40.128829    5342 command_runner.go:130] > Type=notify
	I0831 16:06:40.128832    5342 command_runner.go:130] > Restart=on-failure
	I0831 16:06:40.128836    5342 command_runner.go:130] > Environment=NO_PROXY=192.169.0.13
	I0831 16:06:40.128843    5342 command_runner.go:130] > # This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	I0831 16:06:40.128851    5342 command_runner.go:130] > # The base configuration already specifies an 'ExecStart=...' command. The first directive
	I0831 16:06:40.128859    5342 command_runner.go:130] > # here is to clear out that command inherited from the base configuration. Without this,
	I0831 16:06:40.128869    5342 command_runner.go:130] > # the command from the base configuration and the command specified here are treated as
	I0831 16:06:40.128876    5342 command_runner.go:130] > # a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	I0831 16:06:40.128881    5342 command_runner.go:130] > # will catch this invalid input and refuse to start the service with an error like:
	I0831 16:06:40.128887    5342 command_runner.go:130] > #  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	I0831 16:06:40.128895    5342 command_runner.go:130] > # NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	I0831 16:06:40.128901    5342 command_runner.go:130] > # container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	I0831 16:06:40.128904    5342 command_runner.go:130] > ExecStart=
	I0831 16:06:40.128916    5342 command_runner.go:130] > ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	I0831 16:06:40.128921    5342 command_runner.go:130] > ExecReload=/bin/kill -s HUP $MAINPID
	I0831 16:06:40.128927    5342 command_runner.go:130] > # Having non-zero Limit*s causes performance problems due to accounting overhead
	I0831 16:06:40.128932    5342 command_runner.go:130] > # in the kernel. We recommend using cgroups to do container-local accounting.
	I0831 16:06:40.128936    5342 command_runner.go:130] > LimitNOFILE=infinity
	I0831 16:06:40.128939    5342 command_runner.go:130] > LimitNPROC=infinity
	I0831 16:06:40.128943    5342 command_runner.go:130] > LimitCORE=infinity
	I0831 16:06:40.128948    5342 command_runner.go:130] > # Uncomment TasksMax if your systemd version supports it.
	I0831 16:06:40.128952    5342 command_runner.go:130] > # Only systemd 226 and above support this version.
	I0831 16:06:40.128955    5342 command_runner.go:130] > TasksMax=infinity
	I0831 16:06:40.128959    5342 command_runner.go:130] > TimeoutStartSec=0
	I0831 16:06:40.128965    5342 command_runner.go:130] > # set delegate yes so that systemd does not reset the cgroups of docker containers
	I0831 16:06:40.128968    5342 command_runner.go:130] > Delegate=yes
	I0831 16:06:40.128973    5342 command_runner.go:130] > # kill only the docker process, not all processes in the cgroup
	I0831 16:06:40.128981    5342 command_runner.go:130] > KillMode=process
	I0831 16:06:40.128985    5342 command_runner.go:130] > [Install]
	I0831 16:06:40.128988    5342 command_runner.go:130] > WantedBy=multi-user.target
	I0831 16:06:40.129059    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 16:06:40.143585    5342 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 16:06:40.159860    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 16:06:40.171526    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 16:06:40.182725    5342 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 16:06:40.202277    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 16:06:40.213141    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 16:06:40.227689    5342 command_runner.go:130] > runtime-endpoint: unix:///var/run/cri-dockerd.sock
	I0831 16:06:40.227940    5342 ssh_runner.go:195] Run: which cri-dockerd
	I0831 16:06:40.230571    5342 command_runner.go:130] > /usr/bin/cri-dockerd
	I0831 16:06:40.230752    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 16:06:40.237980    5342 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 16:06:40.251316    5342 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 16:06:40.344822    5342 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 16:06:40.438020    5342 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 16:06:40.438044    5342 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 16:06:40.455070    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:06:40.563046    5342 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 16:06:42.843343    5342 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.280262157s)
	I0831 16:06:42.843404    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 16:06:42.854365    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 16:06:42.865429    5342 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 16:06:42.964508    5342 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 16:06:43.060144    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:06:43.171199    5342 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 16:06:43.184495    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 16:06:43.195303    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:06:43.292027    5342 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 16:06:43.346531    5342 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 16:06:43.346607    5342 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 16:06:43.350885    5342 command_runner.go:130] >   File: /var/run/cri-dockerd.sock
	I0831 16:06:43.350897    5342 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I0831 16:06:43.350902    5342 command_runner.go:130] > Device: 0,22	Inode: 767         Links: 1
	I0831 16:06:43.350914    5342 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: ( 1000/  docker)
	I0831 16:06:43.350920    5342 command_runner.go:130] > Access: 2024-08-31 23:06:43.421044221 +0000
	I0831 16:06:43.350928    5342 command_runner.go:130] > Modify: 2024-08-31 23:06:43.421044221 +0000
	I0831 16:06:43.350933    5342 command_runner.go:130] > Change: 2024-08-31 23:06:43.423044295 +0000
	I0831 16:06:43.350936    5342 command_runner.go:130] >  Birth: -
	I0831 16:06:43.350987    5342 start.go:563] Will wait 60s for crictl version
	I0831 16:06:43.351037    5342 ssh_runner.go:195] Run: which crictl
	I0831 16:06:43.355247    5342 command_runner.go:130] > /usr/bin/crictl
	I0831 16:06:43.355322    5342 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 16:06:43.389595    5342 command_runner.go:130] > Version:  0.1.0
	I0831 16:06:43.389610    5342 command_runner.go:130] > RuntimeName:  docker
	I0831 16:06:43.389637    5342 command_runner.go:130] > RuntimeVersion:  27.2.0
	I0831 16:06:43.389673    5342 command_runner.go:130] > RuntimeApiVersion:  v1
	I0831 16:06:43.390782    5342 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 16:06:43.390849    5342 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 16:06:43.408670    5342 command_runner.go:130] > 27.2.0
	I0831 16:06:43.409527    5342 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 16:06:43.426841    5342 command_runner.go:130] > 27.2.0
	I0831 16:06:43.450044    5342 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 16:06:43.490875    5342 out.go:177]   - env NO_PROXY=192.169.0.13
	I0831 16:06:43.512045    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetIP
	I0831 16:06:43.512422    5342 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 16:06:43.517426    5342 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 16:06:43.527319    5342 mustload.go:65] Loading cluster: multinode-957000
	I0831 16:06:43.527492    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:06:43.527720    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:06:43.527735    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:06:43.536424    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53188
	I0831 16:06:43.536791    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:06:43.537116    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:06:43.537129    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:06:43.537359    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:06:43.537467    5342 main.go:141] libmachine: (multinode-957000) Calling .GetState
	I0831 16:06:43.537544    5342 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:06:43.537621    5342 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 5355
	I0831 16:06:43.538564    5342 host.go:66] Checking if "multinode-957000" exists ...
	I0831 16:06:43.538810    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:06:43.538826    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:06:43.547412    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53190
	I0831 16:06:43.547746    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:06:43.548111    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:06:43.548126    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:06:43.548317    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:06:43.548433    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:06:43.548522    5342 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000 for IP: 192.169.0.14
	I0831 16:06:43.548528    5342 certs.go:194] generating shared ca certs ...
	I0831 16:06:43.548538    5342 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 16:06:43.548677    5342 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 16:06:43.548735    5342 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 16:06:43.548745    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 16:06:43.548770    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 16:06:43.548794    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 16:06:43.548812    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 16:06:43.548895    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 16:06:43.548935    5342 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 16:06:43.548945    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 16:06:43.548981    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 16:06:43.549015    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 16:06:43.549043    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 16:06:43.549109    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 16:06:43.549147    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 16:06:43.549168    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:06:43.549186    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 16:06:43.549214    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 16:06:43.569447    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 16:06:43.589521    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 16:06:43.610235    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 16:06:43.630720    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 16:06:43.650512    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 16:06:43.670170    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 16:06:43.689678    5342 ssh_runner.go:195] Run: openssl version
	I0831 16:06:43.693859    5342 command_runner.go:130] > OpenSSL 1.1.1w  11 Sep 2023
	I0831 16:06:43.693993    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 16:06:43.702468    5342 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 16:06:43.705829    5342 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 16:06:43.705926    5342 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 16:06:43.705966    5342 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 16:06:43.710054    5342 command_runner.go:130] > 51391683
	I0831 16:06:43.710215    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 16:06:43.718612    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 16:06:43.727182    5342 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 16:06:43.730502    5342 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 16:06:43.730590    5342 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 16:06:43.730635    5342 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 16:06:43.734714    5342 command_runner.go:130] > 3ec20f2e
	I0831 16:06:43.734901    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 16:06:43.743385    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 16:06:43.751818    5342 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:06:43.755176    5342 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:06:43.755299    5342 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:06:43.755334    5342 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:06:43.759406    5342 command_runner.go:130] > b5213941
	I0831 16:06:43.759593    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 16:06:43.767931    5342 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 16:06:43.771069    5342 command_runner.go:130] ! stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 16:06:43.771089    5342 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 16:06:43.771116    5342 kubeadm.go:934] updating node {m02 192.169.0.14 8443 v1.31.0 docker false true} ...
	I0831 16:06:43.771175    5342 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=multinode-957000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.14
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:multinode-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 16:06:43.771221    5342 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 16:06:43.778551    5342 command_runner.go:130] > kubeadm
	I0831 16:06:43.778560    5342 command_runner.go:130] > kubectl
	I0831 16:06:43.778563    5342 command_runner.go:130] > kubelet
	I0831 16:06:43.778616    5342 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 16:06:43.778671    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0831 16:06:43.786106    5342 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (319 bytes)
	I0831 16:06:43.799545    5342 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 16:06:43.812904    5342 ssh_runner.go:195] Run: grep 192.169.0.13	control-plane.minikube.internal$ /etc/hosts
	I0831 16:06:43.815847    5342 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.13	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 16:06:43.825355    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:06:43.917556    5342 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 16:06:43.932069    5342 host.go:66] Checking if "multinode-957000" exists ...
	I0831 16:06:43.932358    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:06:43.932377    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:06:43.941273    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53192
	I0831 16:06:43.941640    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:06:43.941966    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:06:43.941978    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:06:43.942183    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:06:43.942288    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:06:43.942378    5342 start.go:317] joinCluster: &{Name:multinode-957000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:multinode-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.13 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true} {Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:06:43.942460    5342 start.go:330] removing existing worker node "m02" before attempting to rejoin cluster: &{Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0831 16:06:43.942480    5342 host.go:66] Checking if "multinode-957000-m02" exists ...
	I0831 16:06:43.942734    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:06:43.942752    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:06:43.951581    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53194
	I0831 16:06:43.951939    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:06:43.952271    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:06:43.952283    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:06:43.952483    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:06:43.952592    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 16:06:43.952674    5342 mustload.go:65] Loading cluster: multinode-957000
	I0831 16:06:43.952846    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:06:43.953062    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:06:43.953079    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:06:43.961793    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53196
	I0831 16:06:43.962140    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:06:43.962511    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:06:43.962530    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:06:43.962724    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:06:43.962833    5342 main.go:141] libmachine: (multinode-957000) Calling .GetState
	I0831 16:06:43.962911    5342 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:06:43.962984    5342 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 5355
	I0831 16:06:43.963947    5342 host.go:66] Checking if "multinode-957000" exists ...
	I0831 16:06:43.964196    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:06:43.964223    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:06:43.972984    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53198
	I0831 16:06:43.973339    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:06:43.973674    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:06:43.973685    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:06:43.973878    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:06:43.973988    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:06:43.974077    5342 api_server.go:166] Checking apiserver status ...
	I0831 16:06:43.974131    5342 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 16:06:43.974142    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:06:43.974224    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:06:43.974310    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:06:43.974393    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:06:43.974475    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:06:44.015721    5342 command_runner.go:130] > 1696
	I0831 16:06:44.015829    5342 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1696/cgroup
	W0831 16:06:44.023227    5342 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1696/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 16:06:44.023286    5342 ssh_runner.go:195] Run: ls
	I0831 16:06:44.026431    5342 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 16:06:44.029564    5342 api_server.go:279] https://192.169.0.13:8443/healthz returned 200:
	ok
	I0831 16:06:44.029627    5342 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl drain multinode-957000-m02 --force --grace-period=1 --skip-wait-for-delete-timeout=1 --disable-eviction --ignore-daemonsets --delete-emptydir-data
	I0831 16:06:44.120348    5342 command_runner.go:130] > node/multinode-957000-m02 cordoned
	I0831 16:06:47.141716    5342 command_runner.go:130] > pod "busybox-7dff88458-rjh4x" has DeletionTimestamp older than 1 seconds, skipping
	I0831 16:06:47.141736    5342 command_runner.go:130] > node/multinode-957000-m02 drained
	I0831 16:06:47.143459    5342 command_runner.go:130] ! Warning: ignoring DaemonSet-managed Pods: kube-system/kindnet-gkhfh, kube-system/kube-proxy-cplv4
	I0831 16:06:47.143521    5342 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl drain multinode-957000-m02 --force --grace-period=1 --skip-wait-for-delete-timeout=1 --disable-eviction --ignore-daemonsets --delete-emptydir-data: (3.113860439s)
	I0831 16:06:47.143533    5342 node.go:128] successfully drained node "multinode-957000-m02"
	I0831 16:06:47.143555    5342 ssh_runner.go:195] Run: /bin/bash -c "KUBECONFIG=/var/lib/minikube/kubeconfig sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm reset --force --ignore-preflight-errors=all --cri-socket=unix:///var/run/cri-dockerd.sock"
	I0831 16:06:47.143574    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 16:06:47.143721    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 16:06:47.143804    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 16:06:47.143890    5342 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 16:06:47.143984    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.14 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/id_rsa Username:docker}
	I0831 16:06:47.225311    5342 command_runner.go:130] ! W0831 23:06:47.346243    1258 removeetcdmember.go:106] [reset] No kubeadm config, using etcd pod spec to get data directory
	I0831 16:06:47.270061    5342 command_runner.go:130] ! W0831 23:06:47.389367    1258 cleanupnode.go:105] [reset] Failed to remove containers: failed to stop running pod d51d739754d2a9c324a0cdede65c4021388c98a2270b84cf7fbf28b19d597e53: rpc error: code = Unknown desc = networkPlugin cni failed to teardown pod "busybox-7dff88458-rjh4x_default" network: cni config uninitialized
	I0831 16:06:47.271786    5342 command_runner.go:130] > [preflight] Running pre-flight checks
	I0831 16:06:47.271800    5342 command_runner.go:130] > [reset] Deleted contents of the etcd data directory: /var/lib/etcd
	I0831 16:06:47.271805    5342 command_runner.go:130] > [reset] Stopping the kubelet service
	I0831 16:06:47.271815    5342 command_runner.go:130] > [reset] Unmounting mounted directories in "/var/lib/kubelet"
	I0831 16:06:47.271823    5342 command_runner.go:130] > [reset] Deleting contents of directories: [/etc/kubernetes/manifests /var/lib/kubelet /etc/kubernetes/pki]
	I0831 16:06:47.271833    5342 command_runner.go:130] > [reset] Deleting files: [/etc/kubernetes/admin.conf /etc/kubernetes/super-admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/bootstrap-kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf]
	I0831 16:06:47.271839    5342 command_runner.go:130] > The reset process does not clean CNI configuration. To do so, you must remove /etc/cni/net.d
	I0831 16:06:47.271860    5342 command_runner.go:130] > The reset process does not reset or clean up iptables rules or IPVS tables.
	I0831 16:06:47.271874    5342 command_runner.go:130] > If you wish to reset iptables, you must do so manually by using the "iptables" command.
	I0831 16:06:47.271883    5342 command_runner.go:130] > If your cluster was setup to utilize IPVS, run ipvsadm --clear (or similar)
	I0831 16:06:47.271900    5342 command_runner.go:130] > to reset your system's IPVS tables.
	I0831 16:06:47.271908    5342 command_runner.go:130] > The reset process does not clean your kubeconfig files and you must remove them manually.
	I0831 16:06:47.271917    5342 command_runner.go:130] > Please, check the contents of the $HOME/.kube/config file.
	I0831 16:06:47.271930    5342 node.go:155] successfully reset node "multinode-957000-m02"
	I0831 16:06:47.272213    5342 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:06:47.272456    5342 kapi.go:59] client config for multinode-957000: &rest.Config{Host:"https://192.169.0.13:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x352cc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 16:06:47.272728    5342 request.go:1351] Request Body: {"kind":"DeleteOptions","apiVersion":"v1"}
	I0831 16:06:47.272760    5342 round_trippers.go:463] DELETE https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:47.272765    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:47.272775    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:47.272779    5342 round_trippers.go:473]     Content-Type: application/json
	I0831 16:06:47.272782    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:47.275714    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:47.275728    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:47.275736    5342 round_trippers.go:580]     Audit-Id: ff368846-064a-48a3-ba47-77fe79a5de70
	I0831 16:06:47.275740    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:47.275744    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:47.275748    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:47.275750    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:47.275753    5342 round_trippers.go:580]     Content-Length: 171
	I0831 16:06:47.275755    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:47 GMT
	I0831 16:06:47.275773    5342 request.go:1351] Response Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"multinode-957000-m02","kind":"nodes","uid":"80356a3f-91f2-42b6-b267-2e41c24b1477"}}
	I0831 16:06:47.275800    5342 node.go:180] successfully deleted node "multinode-957000-m02"
	I0831 16:06:47.275808    5342 start.go:334] successfully removed existing worker node "m02" from cluster: &{Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0831 16:06:47.275826    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0831 16:06:47.275845    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:06:47.276061    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:06:47.276172    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:06:47.276305    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:06:47.276427    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:06:47.356094    5342 command_runner.go:130] > kubeadm join control-plane.minikube.internal:8443 --token 0yu40h.bdlsc84zuvzkv1q0 --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 
	I0831 16:06:47.357284    5342 start.go:343] trying to join worker node "m02" to cluster: &{Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0831 16:06:47.357306    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 0yu40h.bdlsc84zuvzkv1q0 --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=multinode-957000-m02"
	I0831 16:06:47.391409    5342 command_runner.go:130] > [preflight] Running pre-flight checks
	I0831 16:06:47.467352    5342 command_runner.go:130] > [preflight] Reading configuration from the cluster...
	I0831 16:06:47.467369    5342 command_runner.go:130] > [preflight] FYI: You can look at this config file with 'kubectl -n kube-system get cm kubeadm-config -o yaml'
	I0831 16:06:47.497898    5342 command_runner.go:130] > [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0831 16:06:47.497914    5342 command_runner.go:130] > [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0831 16:06:47.497919    5342 command_runner.go:130] > [kubelet-start] Starting the kubelet
	I0831 16:06:47.620575    5342 command_runner.go:130] > [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I0831 16:06:48.622093    5342 command_runner.go:130] > [kubelet-check] The kubelet is healthy after 1.002178536s
	I0831 16:06:48.622111    5342 command_runner.go:130] > [kubelet-start] Waiting for the kubelet to perform the TLS Bootstrap
	I0831 16:06:49.143605    5342 command_runner.go:130] > This node has joined the cluster:
	I0831 16:06:49.143621    5342 command_runner.go:130] > * Certificate signing request was sent to apiserver and a response was received.
	I0831 16:06:49.143626    5342 command_runner.go:130] > * The Kubelet was informed of the new secure connection details.
	I0831 16:06:49.143632    5342 command_runner.go:130] > Run 'kubectl get nodes' on the control-plane to see this node join the cluster.
	I0831 16:06:49.145590    5342 command_runner.go:130] ! 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0831 16:06:49.145702    5342 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 0yu40h.bdlsc84zuvzkv1q0 --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=multinode-957000-m02": (1.788363517s)
	I0831 16:06:49.145716    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0831 16:06:49.362434    5342 command_runner.go:130] ! Created symlink /etc/systemd/system/multi-user.target.wants/kubelet.service → /usr/lib/systemd/system/kubelet.service.
	I0831 16:06:49.362580    5342 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes multinode-957000-m02 minikube.k8s.io/updated_at=2024_08_31T16_06_49_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=multinode-957000 minikube.k8s.io/primary=false
	I0831 16:06:49.434153    5342 command_runner.go:130] > node/multinode-957000-m02 labeled
	I0831 16:06:49.435314    5342 start.go:319] duration metric: took 5.492905293s to joinCluster
	I0831 16:06:49.435356    5342 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0831 16:06:49.435588    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:06:49.458962    5342 out.go:177] * Verifying Kubernetes components...
	I0831 16:06:49.517294    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:06:49.624491    5342 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 16:06:49.636866    5342 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:06:49.637056    5342 kapi.go:59] client config for multinode-957000: &rest.Config{Host:"https://192.169.0.13:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x352cc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 16:06:49.637250    5342 node_ready.go:35] waiting up to 6m0s for node "multinode-957000-m02" to be "Ready" ...
	I0831 16:06:49.637291    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:49.637296    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:49.637302    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:49.637307    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:49.638849    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:49.638858    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:49.638863    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:49.638878    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:49.638885    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:49.638887    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:49 GMT
	I0831 16:06:49.638890    5342 round_trippers.go:580]     Audit-Id: 129a5b7a-dacc-48da-80b5-14b7679de0da
	I0831 16:06:49.638892    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:49.639003    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"985","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}," [truncated 3557 chars]
	I0831 16:06:50.138748    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:50.138772    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:50.138784    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:50.138790    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:50.141462    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:50.141480    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:50.141487    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:50.141492    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:50.141495    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:50.141500    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:50.141504    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:50 GMT
	I0831 16:06:50.141508    5342 round_trippers.go:580]     Audit-Id: fd05e369-c60c-4000-8da4-67a3fb2c1258
	I0831 16:06:50.141758    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:50.637598    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:50.637618    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:50.637630    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:50.637636    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:50.640317    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:50.640331    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:50.640339    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:50.640343    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:50.640350    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:50.640356    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:50 GMT
	I0831 16:06:50.640361    5342 round_trippers.go:580]     Audit-Id: a7a344fb-7008-4f5b-b727-df24e9333bde
	I0831 16:06:50.640365    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:50.640661    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:51.138695    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:51.138709    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:51.138716    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:51.138719    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:51.140517    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:51.140532    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:51.140540    5342 round_trippers.go:580]     Audit-Id: ccc7ccfe-a397-4991-91b0-befabf356fcd
	I0831 16:06:51.140544    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:51.140548    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:51.140551    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:51.140555    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:51.140557    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:51 GMT
	I0831 16:06:51.140781    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:51.637540    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:51.637553    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:51.637559    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:51.637562    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:51.639055    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:06:51.639067    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:51.639073    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:51.639076    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:51.639079    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:51.639082    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:51.639085    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:51 GMT
	I0831 16:06:51.639088    5342 round_trippers.go:580]     Audit-Id: 3cbfa4d8-f512-4fbb-a1bd-f6c665d82f45
	I0831 16:06:51.639149    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:51.639323    5342 node_ready.go:53] node "multinode-957000-m02" has status "Ready":"False"
	I0831 16:06:52.138504    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:52.138524    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:52.138535    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:52.138542    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:52.140916    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:52.140929    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:52.140936    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:52.140942    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:52.140946    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:52.140949    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:52 GMT
	I0831 16:06:52.140953    5342 round_trippers.go:580]     Audit-Id: 4aed1cbe-239c-49fb-8723-e30ec54953a2
	I0831 16:06:52.140956    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:52.141176    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:52.638495    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:52.638518    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:52.638531    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:52.638539    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:52.640847    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:52.640863    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:52.640870    5342 round_trippers.go:580]     Audit-Id: 7e4c000d-0a04-428f-a1ad-dac4942d45b4
	I0831 16:06:52.640874    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:52.640877    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:52.640892    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:52.640898    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:52.640902    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:52 GMT
	I0831 16:06:52.641324    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:53.138062    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:53.138087    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:53.138099    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:53.138105    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:53.140686    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:53.140705    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:53.140713    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:53.140718    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:53 GMT
	I0831 16:06:53.140722    5342 round_trippers.go:580]     Audit-Id: a7476379-e2aa-4f38-9429-9d8fd9766769
	I0831 16:06:53.140726    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:53.140730    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:53.140734    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:53.140921    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:53.637984    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:53.638000    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:53.638008    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:53.638012    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:53.640054    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:53.640064    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:53.640070    5342 round_trippers.go:580]     Audit-Id: 68e4053e-a508-4a6a-951a-8e4ebe9402d6
	I0831 16:06:53.640073    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:53.640076    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:53.640079    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:53.640084    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:53.640086    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:53 GMT
	I0831 16:06:53.640293    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:53.640467    5342 node_ready.go:53] node "multinode-957000-m02" has status "Ready":"False"
	I0831 16:06:54.138811    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:54.138835    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:54.138847    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:54.138853    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:54.141183    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:54.141197    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:54.141204    5342 round_trippers.go:580]     Audit-Id: e6239643-c76c-446b-b11d-cbf49fa0bdfd
	I0831 16:06:54.141231    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:54.141239    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:54.141244    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:54.141248    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:54.141253    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:54 GMT
	I0831 16:06:54.141351    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:54.637786    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:54.637811    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:54.637823    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:54.637832    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:54.640664    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:54.640681    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:54.640688    5342 round_trippers.go:580]     Audit-Id: ca85879b-50d2-416b-9560-0c6d89e7f19c
	I0831 16:06:54.640692    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:54.640695    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:54.640698    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:54.640702    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:54.640726    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:54 GMT
	I0831 16:06:54.641070    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:55.138396    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:55.138422    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:55.138434    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:55.138442    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:55.141238    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:55.141254    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:55.141262    5342 round_trippers.go:580]     Audit-Id: 4864b999-f201-4b38-872a-ad26ec6ddecc
	I0831 16:06:55.141266    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:55.141270    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:55.141273    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:55.141277    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:55.141311    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:55 GMT
	I0831 16:06:55.141387    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:55.638699    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:55.638726    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:55.638735    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:55.638739    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:55.640880    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:55.640890    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:55.640895    5342 round_trippers.go:580]     Audit-Id: f4a01a80-6812-4b2a-9d85-5d89ceccf55c
	I0831 16:06:55.640899    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:55.640902    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:55.640905    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:55.640909    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:55.640913    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:55 GMT
	I0831 16:06:55.640963    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:55.641127    5342 node_ready.go:53] node "multinode-957000-m02" has status "Ready":"False"
	I0831 16:06:56.138275    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:56.138297    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:56.138307    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:56.138313    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:56.141138    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:56.141153    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:56.141161    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:56.141169    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:56 GMT
	I0831 16:06:56.141174    5342 round_trippers.go:580]     Audit-Id: 1c6bd83f-9e5b-4a16-97c9-6479fe55fa96
	I0831 16:06:56.141180    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:56.141183    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:56.141187    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:56.141299    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:56.637973    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:56.638052    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:56.638068    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:56.638076    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:56.640644    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:56.640665    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:56.640675    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:56.640682    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:56.640688    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:56.640693    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:56.640699    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:56 GMT
	I0831 16:06:56.640705    5342 round_trippers.go:580]     Audit-Id: 9e93b8ec-e022-48f0-84a0-d0238248609c
	I0831 16:06:56.640913    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:57.137751    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:57.137773    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:57.137785    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:57.137792    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:57.140473    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:57.140486    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:57.140493    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:57.140499    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:57.140503    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:57.140506    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:57.140513    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:57 GMT
	I0831 16:06:57.140517    5342 round_trippers.go:580]     Audit-Id: c13ae97e-15f6-4a4b-844c-40d1e9577525
	I0831 16:06:57.140781    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:57.637625    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:57.637643    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:57.637652    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:57.637657    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:57.639836    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:57.639858    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:57.639865    5342 round_trippers.go:580]     Audit-Id: 76ad29c0-40cb-46e3-b371-b69dbca9bb69
	I0831 16:06:57.639868    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:57.639871    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:57.639875    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:57.639878    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:57.639888    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:57 GMT
	I0831 16:06:57.640357    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:58.139400    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:58.139424    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:58.139436    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:58.139445    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:58.142203    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:58.142222    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:58.142230    5342 round_trippers.go:580]     Audit-Id: c50dbf74-f293-4c02-8c7b-179f8d2ed112
	I0831 16:06:58.142235    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:58.142239    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:58.142252    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:58.142257    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:58.142260    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:58 GMT
	I0831 16:06:58.142393    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:58.142620    5342 node_ready.go:53] node "multinode-957000-m02" has status "Ready":"False"
	I0831 16:06:58.638722    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:58.638745    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:58.638757    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:58.638763    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:58.641448    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:58.641469    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:58.641476    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:58.641480    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:58 GMT
	I0831 16:06:58.641484    5342 round_trippers.go:580]     Audit-Id: 036b76d5-2d4d-4fc5-86e9-11c83b09275a
	I0831 16:06:58.641490    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:58.641493    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:58.641496    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:58.641574    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"987","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-atta [truncated 3666 chars]
	I0831 16:06:59.138864    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:59.138888    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:59.138900    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:59.138907    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:59.141586    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:06:59.141600    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:59.141607    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:59.141612    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:59.141615    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:59.141618    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:59 GMT
	I0831 16:06:59.141654    5342 round_trippers.go:580]     Audit-Id: 9cae6ec3-15a8-4b10-9afd-c57bbd79a74e
	I0831 16:06:59.141660    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:59.141758    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1015","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 4059 chars]
	I0831 16:06:59.638966    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:06:59.638992    5342 round_trippers.go:469] Request Headers:
	I0831 16:06:59.639005    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:06:59.639011    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:06:59.642483    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:06:59.642499    5342 round_trippers.go:577] Response Headers:
	I0831 16:06:59.642507    5342 round_trippers.go:580]     Audit-Id: f0dd2010-5307-4b60-bb85-0e37c04c0ea9
	I0831 16:06:59.642511    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:06:59.642516    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:06:59.642519    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:06:59.642523    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:06:59.642526    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:06:59 GMT
	I0831 16:06:59.642828    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1015","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 4059 chars]
	I0831 16:07:00.137711    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:00.137733    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:00.137744    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:00.137753    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:00.140540    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:00.140556    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:00.140563    5342 round_trippers.go:580]     Audit-Id: 6e2679ab-e84c-41f9-b676-e941863adcf9
	I0831 16:07:00.140568    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:00.140572    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:00.140580    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:00.140584    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:00.140587    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:00 GMT
	I0831 16:07:00.140751    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1015","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 4059 chars]
	I0831 16:07:00.638635    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:00.638663    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:00.638674    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:00.638679    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:00.641324    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:00.641339    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:00.641346    5342 round_trippers.go:580]     Audit-Id: b9fd8221-7a6b-4191-aaa0-f86b3540f8cb
	I0831 16:07:00.641350    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:00.641354    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:00.641357    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:00.641361    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:00.641364    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:00 GMT
	I0831 16:07:00.641458    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1015","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 4059 chars]
	I0831 16:07:00.641702    5342 node_ready.go:53] node "multinode-957000-m02" has status "Ready":"False"
	I0831 16:07:01.137844    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:01.137865    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:01.137877    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:01.137882    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:01.140327    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:01.140340    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:01.140350    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:01.140354    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:01 GMT
	I0831 16:07:01.140361    5342 round_trippers.go:580]     Audit-Id: a779d601-7f30-4c88-bc78-0d2497c0e790
	I0831 16:07:01.140366    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:01.140371    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:01.140379    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:01.140582    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1015","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 4059 chars]
	I0831 16:07:01.638602    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:01.638630    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:01.638642    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:01.638648    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:01.641321    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:01.641341    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:01.641348    5342 round_trippers.go:580]     Audit-Id: bf8b54b7-de15-4663-a3f8-7f3a3a2be253
	I0831 16:07:01.641359    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:01.641363    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:01.641373    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:01.641378    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:01.641382    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:01 GMT
	I0831 16:07:01.641562    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1015","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 4059 chars]
	I0831 16:07:02.137605    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:02.137630    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:02.137642    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:02.137651    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:02.140258    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:02.140273    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:02.140281    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:02.140287    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:02.140293    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:02.140297    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:02.140300    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:02 GMT
	I0831 16:07:02.140304    5342 round_trippers.go:580]     Audit-Id: 8a6fc20e-0452-46ac-87b7-0d3e5149a8d2
	I0831 16:07:02.140702    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1015","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 4059 chars]
	I0831 16:07:02.638609    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:02.638637    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:02.638649    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:02.638656    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:02.641439    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:02.641457    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:02.641465    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:02.641469    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:02.641480    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:02.641483    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:02.641492    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:02 GMT
	I0831 16:07:02.641496    5342 round_trippers.go:580]     Audit-Id: 14ec5c87-0f0d-478a-8004-ce8204f51e9a
	I0831 16:07:02.641563    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1015","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 4059 chars]
	I0831 16:07:02.641801    5342 node_ready.go:53] node "multinode-957000-m02" has status "Ready":"False"
	I0831 16:07:03.138662    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:03.138684    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.138696    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.138703    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.141362    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:03.141377    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.141385    5342 round_trippers.go:580]     Audit-Id: 25541465-c271-4bed-9b3a-a465ea8a0a4d
	I0831 16:07:03.141390    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.141416    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.141426    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.141432    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.141437    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.141518    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1015","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 4059 chars]
	I0831 16:07:03.637935    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:03.637956    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.637982    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.637987    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.639569    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:03.639580    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.639585    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.639588    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.639590    5342 round_trippers.go:580]     Audit-Id: 1b1731b7-c535-4a9c-88e0-8053a0c40151
	I0831 16:07:03.639592    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.639594    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.639597    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.639666    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1023","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 3925 chars]
	I0831 16:07:03.639843    5342 node_ready.go:49] node "multinode-957000-m02" has status "Ready":"True"
	I0831 16:07:03.639852    5342 node_ready.go:38] duration metric: took 14.002510104s for node "multinode-957000-m02" to be "Ready" ...
	I0831 16:07:03.639858    5342 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 16:07:03.639893    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods
	I0831 16:07:03.639899    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.639904    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.639908    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.643291    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:07:03.643300    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.643305    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.643308    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.643332    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.643343    5342 round_trippers.go:580]     Audit-Id: d8b53fd4-6e8a-4c83-ab4c-d009d423c5d9
	I0831 16:07:03.643349    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.643355    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.643939    5342 request.go:1351] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"resourceVersion":"1025"},"items":[{"metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"892","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"
f:preferredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers": [truncated 89363 chars]
	I0831 16:07:03.645914    5342 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:03.645964    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:07:03.645969    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.645975    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.645979    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.647398    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:03.647407    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.647425    5342 round_trippers.go:580]     Audit-Id: fb9a0550-37a5-4809-8b56-4ec549557a08
	I0831 16:07:03.647432    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.647439    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.647444    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.647448    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.647454    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.647623    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"892","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7039 chars]
	I0831 16:07:03.647883    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:03.647890    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.647898    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.647903    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.650451    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:03.650459    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.650467    5342 round_trippers.go:580]     Audit-Id: 8a6e3f33-6d31-4308-a012-8afaaaccf0cc
	I0831 16:07:03.650470    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.650473    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.650476    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.650479    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.650482    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.650823    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:03.651011    5342 pod_ready.go:93] pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:03.651020    5342 pod_ready.go:82] duration metric: took 5.095338ms for pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:03.651026    5342 pod_ready.go:79] waiting up to 6m0s for pod "etcd-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:03.651064    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/etcd-multinode-957000
	I0831 16:07:03.651070    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.651075    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.651079    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.652423    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:03.652432    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.652437    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.652440    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.652443    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.652446    5342 round_trippers.go:580]     Audit-Id: 65e38a83-61f1-475e-a05c-ecfbc6338ec9
	I0831 16:07:03.652448    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.652452    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.652650    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"etcd-multinode-957000","namespace":"kube-system","uid":"b4833809-a14f-49f4-b877-9f7e4be0bd39","resourceVersion":"857","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"etcd","tier":"control-plane"},"annotations":{"kubeadm.kubernetes.io/etcd.advertise-client-urls":"https://192.169.0.13:2379","kubernetes.io/config.hash":"7ee006dc216d695a2fa4355a2abea57a","kubernetes.io/config.mirror":"7ee006dc216d695a2fa4355a2abea57a","kubernetes.io/config.seen":"2024-08-31T22:57:31.349647295Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubeadm.kubernetes.io/etcd.advertise-cl
ient-urls":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config. [truncated 6663 chars]
	I0831 16:07:03.652888    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:03.652894    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.652899    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.652904    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.653916    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:03.653923    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.653928    5342 round_trippers.go:580]     Audit-Id: b32a87c9-be47-4b7a-aba0-1d6449992c4d
	I0831 16:07:03.653933    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.653937    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.653943    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.653946    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.653949    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.654029    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:03.654201    5342 pod_ready.go:93] pod "etcd-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:03.654209    5342 pod_ready.go:82] duration metric: took 3.178273ms for pod "etcd-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:03.654219    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:03.654255    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-multinode-957000
	I0831 16:07:03.654263    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.654268    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.654272    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.655245    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:07:03.655251    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.655256    5342 round_trippers.go:580]     Audit-Id: c94374bc-66a6-4342-8803-2a9887dfe83b
	I0831 16:07:03.655259    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.655287    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.655305    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.655314    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.655322    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.655430    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-apiserver-multinode-957000","namespace":"kube-system","uid":"e549c883-0eb6-43a1-be40-c8d2f3a9468e","resourceVersion":"862","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-apiserver","tier":"control-plane"},"annotations":{"kubeadm.kubernetes.io/kube-apiserver.advertise-address.endpoint":"192.169.0.13:8443","kubernetes.io/config.hash":"5db461e18c39888a5ab16fd535bfcb2e","kubernetes.io/config.mirror":"5db461e18c39888a5ab16fd535bfcb2e","kubernetes.io/config.seen":"2024-08-31T22:57:31.349647948Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubeadm.kube
rnetes.io/kube-apiserver.advertise-address.endpoint":{},"f:kubernetes.i [truncated 7891 chars]
	I0831 16:07:03.655665    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:03.655671    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.655677    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.655681    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.656621    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:07:03.656630    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.656640    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.656644    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.656646    5342 round_trippers.go:580]     Audit-Id: 9aad776a-1183-4cec-b8e8-b044f72b3d3e
	I0831 16:07:03.656649    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.656652    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.656655    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.656801    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:03.656967    5342 pod_ready.go:93] pod "kube-apiserver-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:03.656975    5342 pod_ready.go:82] duration metric: took 2.751016ms for pod "kube-apiserver-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:03.656980    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:03.657011    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-multinode-957000
	I0831 16:07:03.657016    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.657021    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.657025    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.657963    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:07:03.657972    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.657977    5342 round_trippers.go:580]     Audit-Id: 27e1fd0b-e9de-488d-82ef-5f9529663c49
	I0831 16:07:03.657981    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.657983    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.657989    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.657993    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.657996    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.658172    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-controller-manager-multinode-957000","namespace":"kube-system","uid":"8a82b721-75a3-4460-b9eb-bfc4db35f20e","resourceVersion":"859","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-controller-manager","tier":"control-plane"},"annotations":{"kubernetes.io/config.hash":"9edb08d8378ca77b90e86ed290d828c5","kubernetes.io/config.mirror":"9edb08d8378ca77b90e86ed290d828c5","kubernetes.io/config.seen":"2024-08-31T22:57:31.349643093Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config.mirror":{},"f:kubernetes.i
o/config.seen":{},"f:kubernetes.io/config.source":{}},"f:labels":{".":{ [truncated 7464 chars]
	I0831 16:07:03.658431    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:03.658438    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.658443    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.658447    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.659469    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:03.659477    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.659484    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.659488    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.659494    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.659497    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:03 GMT
	I0831 16:07:03.659501    5342 round_trippers.go:580]     Audit-Id: 3cee7f12-72be-4b48-812e-8626ba16e480
	I0831 16:07:03.659509    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.659661    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:03.659835    5342 pod_ready.go:93] pod "kube-controller-manager-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:03.659843    5342 pod_ready.go:82] duration metric: took 2.856474ms for pod "kube-controller-manager-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:03.659851    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-cplv4" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:03.838545    5342 request.go:632] Waited for 178.600663ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cplv4
	I0831 16:07:03.838629    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cplv4
	I0831 16:07:03.838640    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:03.838651    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:03.838660    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:03.841355    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:03.841370    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:03.841377    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:04 GMT
	I0831 16:07:03.841382    5342 round_trippers.go:580]     Audit-Id: 51add2d4-8355-4ecb-ac78-58521c3ea037
	I0831 16:07:03.841408    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:03.841416    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:03.841419    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:03.841426    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:03.841558    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-cplv4","generateName":"kube-proxy-","namespace":"kube-system","uid":"56ad32e2-f2ba-4fa5-b093-790a5205b4f2","resourceVersion":"1002","creationTimestamp":"2024-08-31T22:58:18Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:58:18Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:
requiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k [truncated 6198 chars]
	I0831 16:07:04.040074    5342 request.go:632] Waited for 198.108419ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:04.040124    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:04.040132    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:04.040143    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:04.040157    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:04.042542    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:04.042586    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:04.042595    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:04.042599    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:04.042603    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:04.042608    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:04.042617    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:04 GMT
	I0831 16:07:04.042621    5342 round_trippers.go:580]     Audit-Id: a8a8b1b8-586d-4153-9a1e-27cfddbc15e2
	I0831 16:07:04.042883    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1023","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 3925 chars]
	I0831 16:07:04.043120    5342 pod_ready.go:93] pod "kube-proxy-cplv4" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:04.043135    5342 pod_ready.go:82] duration metric: took 383.272182ms for pod "kube-proxy-cplv4" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:04.043143    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-ndfs6" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:04.240079    5342 request.go:632] Waited for 196.845953ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-ndfs6
	I0831 16:07:04.240127    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-ndfs6
	I0831 16:07:04.240135    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:04.240146    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:04.240152    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:04.242870    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:04.242883    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:04.242890    5342 round_trippers.go:580]     Audit-Id: 95b881c6-a840-43a3-8838-3e82ae4f2d4f
	I0831 16:07:04.242914    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:04.242921    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:04.242925    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:04.242928    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:04.242932    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:04 GMT
	I0831 16:07:04.243101    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-ndfs6","generateName":"kube-proxy-","namespace":"kube-system","uid":"34c16419-4c10-41bd-9446-75ba130cbe63","resourceVersion":"911","creationTimestamp":"2024-08-31T22:59:10Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:59:10Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:r
equiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k: [truncated 6422 chars]
	I0831 16:07:04.438324    5342 request.go:632] Waited for 194.866756ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:04.438378    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:04.438387    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:04.438399    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:04.438409    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:04.440959    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:04.440978    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:04.440985    5342 round_trippers.go:580]     Audit-Id: 79a1092b-0d8c-48ab-9f19-27bd95c598b2
	I0831 16:07:04.440989    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:04.441002    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:04.441007    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:04.441012    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:04.441016    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:04 GMT
	I0831 16:07:04.441115    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"0867ece2-944d-429d-b3c6-0eab243276ee","resourceVersion":"928","creationTimestamp":"2024-08-31T23:00:04Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_00_04_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubeadm","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:00:04Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"
f:annotations":{"f:kubeadm.alpha.kubernetes.io/cri-socket":{}}}}},{"man [truncated 4391 chars]
	I0831 16:07:04.441351    5342 pod_ready.go:98] node "multinode-957000-m03" hosting pod "kube-proxy-ndfs6" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000-m03" has status "Ready":"Unknown"
	I0831 16:07:04.441364    5342 pod_ready.go:82] duration metric: took 398.213132ms for pod "kube-proxy-ndfs6" in "kube-system" namespace to be "Ready" ...
	E0831 16:07:04.441372    5342 pod_ready.go:67] WaitExtra: waitPodCondition: node "multinode-957000-m03" hosting pod "kube-proxy-ndfs6" in "kube-system" namespace is currently not "Ready" (skipping!): node "multinode-957000-m03" has status "Ready":"Unknown"
	I0831 16:07:04.441380    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-zf7j6" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:04.638960    5342 request.go:632] Waited for 197.459195ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zf7j6
	I0831 16:07:04.639055    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zf7j6
	I0831 16:07:04.639064    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:04.639077    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:04.639085    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:04.641562    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:04.641574    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:04.641581    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:04.641588    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:04.641591    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:04.641595    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:04 GMT
	I0831 16:07:04.641600    5342 round_trippers.go:580]     Audit-Id: 6d2f3fb0-787e-4eae-bb38-c5001bc991b8
	I0831 16:07:04.641604    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:04.641764    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-zf7j6","generateName":"kube-proxy-","namespace":"kube-system","uid":"e84c5d55-f27d-4d2a-9b41-6f1e6100ad2e","resourceVersion":"756","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:r
equiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k: [truncated 6394 chars]
	I0831 16:07:04.838720    5342 request.go:632] Waited for 196.55257ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:04.838787    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:04.838796    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:04.838807    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:04.838815    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:04.841222    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:04.841239    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:04.841246    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:04.841251    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:04.841255    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:04.841258    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:04.841262    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:05 GMT
	I0831 16:07:04.841275    5342 round_trippers.go:580]     Audit-Id: ea1cbb57-10fc-452d-a810-9e216e155569
	I0831 16:07:04.841392    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:04.841662    5342 pod_ready.go:93] pod "kube-proxy-zf7j6" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:04.841674    5342 pod_ready.go:82] duration metric: took 400.283562ms for pod "kube-proxy-zf7j6" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:04.841683    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:05.039130    5342 request.go:632] Waited for 197.406644ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-multinode-957000
	I0831 16:07:05.039163    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-multinode-957000
	I0831 16:07:05.039168    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:05.039174    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:05.039178    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:05.040420    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:05.040432    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:05.040438    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:05.040442    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:05.040446    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:05.040449    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:05.040454    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:05 GMT
	I0831 16:07:05.040458    5342 round_trippers.go:580]     Audit-Id: d9200e7d-85f7-41a2-a8a2-03c354ddc39e
	I0831 16:07:05.040534    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-scheduler-multinode-957000","namespace":"kube-system","uid":"f48d9647-8460-48da-a5b0-fc471f5536ad","resourceVersion":"847","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-scheduler","tier":"control-plane"},"annotations":{"kubernetes.io/config.hash":"b74e8393ad84ccbcf23f7560eda422b0","kubernetes.io/config.mirror":"b74e8393ad84ccbcf23f7560eda422b0","kubernetes.io/config.seen":"2024-08-31T22:57:31.349646560Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config.mirror":{},"f:kubernetes.io/config.seen":{},
"f:kubernetes.io/config.source":{}},"f:labels":{".":{},"f:component":{} [truncated 5194 chars]
	I0831 16:07:05.238178    5342 request.go:632] Waited for 197.399431ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:05.238283    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:05.238294    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:05.238304    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:05.238310    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:05.240769    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:05.240786    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:05.240795    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:05.240800    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:05.240806    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:05.240812    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:05 GMT
	I0831 16:07:05.240820    5342 round_trippers.go:580]     Audit-Id: a69cb8d9-30ca-45bb-a1b0-ba61898e828a
	I0831 16:07:05.240825    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:05.240970    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:05.241226    5342 pod_ready.go:93] pod "kube-scheduler-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:05.241237    5342 pod_ready.go:82] duration metric: took 399.544727ms for pod "kube-scheduler-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:05.241246    5342 pod_ready.go:39] duration metric: took 1.601371753s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 16:07:05.241262    5342 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 16:07:05.241319    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 16:07:05.253348    5342 system_svc.go:56] duration metric: took 12.083812ms WaitForService to wait for kubelet
	I0831 16:07:05.253362    5342 kubeadm.go:582] duration metric: took 15.817898134s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 16:07:05.253374    5342 node_conditions.go:102] verifying NodePressure condition ...
	I0831 16:07:05.440111    5342 request.go:632] Waited for 186.657614ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes
	I0831 16:07:05.440161    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes
	I0831 16:07:05.440170    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:05.440181    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:05.440191    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:05.443830    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:07:05.443848    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:05.443855    5342 round_trippers.go:580]     Audit-Id: e69994b7-5260-4717-a5b1-12a452b340b0
	I0831 16:07:05.443859    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:05.443908    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:05.443916    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:05.443920    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:05.443926    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:05 GMT
	I0831 16:07:05.444097    5342 request.go:1351] Response Body: {"kind":"NodeList","apiVersion":"v1","metadata":{"resourceVersion":"1028"},"items":[{"metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFie
lds":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time [truncated 15399 chars]
	I0831 16:07:05.444554    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:07:05.444564    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:07:05.444571    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:07:05.444574    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:07:05.444578    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:07:05.444583    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:07:05.444586    5342 node_conditions.go:105] duration metric: took 191.207656ms to run NodePressure ...
	I0831 16:07:05.444595    5342 start.go:241] waiting for startup goroutines ...
	I0831 16:07:05.444616    5342 start.go:255] writing updated cluster config ...
	I0831 16:07:05.466487    5342 out.go:201] 
	I0831 16:07:05.489435    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:07:05.489568    5342 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/config.json ...
	I0831 16:07:05.512156    5342 out.go:177] * Starting "multinode-957000-m03" worker node in "multinode-957000" cluster
	I0831 16:07:05.571067    5342 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 16:07:05.571124    5342 cache.go:56] Caching tarball of preloaded images
	I0831 16:07:05.571331    5342 preload.go:172] Found /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0831 16:07:05.571352    5342 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0 on docker
	I0831 16:07:05.571478    5342 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/config.json ...
	I0831 16:07:05.572289    5342 start.go:360] acquireMachinesLock for multinode-957000-m03: {Name:mk22fcfd9fcc041836c9a4914a62f1ad6b78db01 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0831 16:07:05.572408    5342 start.go:364] duration metric: took 94.263µs to acquireMachinesLock for "multinode-957000-m03"
	I0831 16:07:05.572432    5342 start.go:96] Skipping create...Using existing machine configuration
	I0831 16:07:05.572439    5342 fix.go:54] fixHost starting: m03
	I0831 16:07:05.572848    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:05.572871    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:05.582025    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53204
	I0831 16:07:05.582386    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:05.582717    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:07:05.582727    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:05.582936    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:05.583088    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	I0831 16:07:05.583191    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetState
	I0831 16:07:05.583278    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:07:05.583367    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | hyperkit pid from json: 4887
	I0831 16:07:05.584294    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | hyperkit pid 4887 missing from process table
	I0831 16:07:05.584323    5342 fix.go:112] recreateIfNeeded on multinode-957000-m03: state=Stopped err=<nil>
	I0831 16:07:05.584330    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	W0831 16:07:05.584413    5342 fix.go:138] unexpected machine state, will restart: <nil>
	I0831 16:07:05.606194    5342 out.go:177] * Restarting existing hyperkit VM for "multinode-957000-m03" ...
	I0831 16:07:05.647956    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .Start
	I0831 16:07:05.648225    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:07:05.648276    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/hyperkit.pid
	I0831 16:07:05.648338    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | Using UUID 9306fe61-41f6-4071-8737-9a8c8096e22e
	I0831 16:07:05.675985    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | Generated MAC 66:8:49:a7:32:97
	I0831 16:07:05.676007    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000
	I0831 16:07:05.676129    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"9306fe61-41f6-4071-8737-9a8c8096e22e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00029b860)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:07:05.676164    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"9306fe61-41f6-4071-8737-9a8c8096e22e", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00029b860)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0831 16:07:05.676201    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "9306fe61-41f6-4071-8737-9a8c8096e22e", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/multinode-957000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000"}
	I0831 16:07:05.676251    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 9306fe61-41f6-4071-8737-9a8c8096e22e -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/multinode-957000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/tty,log=/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/bzimage,/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=multinode-957000"
	I0831 16:07:05.676267    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0831 16:07:05.677648    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 DEBUG: hyperkit: Pid is 5425
	I0831 16:07:05.678062    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | Attempt 0
	I0831 16:07:05.678072    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:07:05.678130    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | hyperkit pid from json: 5425
	I0831 16:07:05.679216    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | Searching for 66:8:49:a7:32:97 in /var/db/dhcpd_leases ...
	I0831 16:07:05.679286    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | Found 14 entries in /var/db/dhcpd_leases!
	I0831 16:07:05.679307    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:6:27:eb:c0:a3:31 ID:1,6:27:eb:c0:a3:31 Lease:0x66d4f363}
	I0831 16:07:05.679342    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:52:11:67:f6:63:f1 ID:1,52:11:67:f6:63:f1 Lease:0x66d4f311}
	I0831 16:07:05.679382    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:66:8:49:a7:32:97 ID:1,66:8:49:a7:32:97 Lease:0x66d3a08a}
	I0831 16:07:05.679399    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | Found match: 66:8:49:a7:32:97
	I0831 16:07:05.679402    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetConfigRaw
	I0831 16:07:05.679426    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | IP: 192.169.0.15
	I0831 16:07:05.680134    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetIP
	I0831 16:07:05.680350    5342 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/config.json ...
	I0831 16:07:05.680783    5342 machine.go:93] provisionDockerMachine start ...
	I0831 16:07:05.680795    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	I0831 16:07:05.680912    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:05.681031    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:05.681142    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:05.681263    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:05.681366    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:05.681501    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:07:05.681647    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.15 22 <nil> <nil>}
	I0831 16:07:05.681656    5342 main.go:141] libmachine: About to run SSH command:
	hostname
	I0831 16:07:05.685552    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0831 16:07:05.693750    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0831 16:07:05.694728    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:07:05.694742    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:07:05.694760    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:07:05.694768    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:05 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:07:06.081910    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:06 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0831 16:07:06.081925    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:06 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0831 16:07:06.197153    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0831 16:07:06.197173    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0831 16:07:06.197198    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0831 16:07:06.197214    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:06 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0831 16:07:06.198064    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:06 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0831 16:07:06.198081    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:06 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0831 16:07:11.785143    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:11 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0831 16:07:11.785196    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:11 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0831 16:07:11.785205    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:11 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0831 16:07:11.809032    5342 main.go:141] libmachine: (multinode-957000-m03) DBG | 2024/08/31 16:07:11 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0831 16:07:16.748413    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0831 16:07:16.748428    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetMachineName
	I0831 16:07:16.748580    5342 buildroot.go:166] provisioning hostname "multinode-957000-m03"
	I0831 16:07:16.748594    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetMachineName
	I0831 16:07:16.748681    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:16.748782    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:16.748866    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:16.748954    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:16.749050    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:16.749178    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:07:16.749319    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.15 22 <nil> <nil>}
	I0831 16:07:16.749327    5342 main.go:141] libmachine: About to run SSH command:
	sudo hostname multinode-957000-m03 && echo "multinode-957000-m03" | sudo tee /etc/hostname
	I0831 16:07:16.819789    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: multinode-957000-m03
	
	I0831 16:07:16.819803    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:16.819936    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:16.820049    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:16.820129    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:16.820217    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:16.820344    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:07:16.820483    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.15 22 <nil> <nil>}
	I0831 16:07:16.820495    5342 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\smultinode-957000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 multinode-957000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 multinode-957000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0831 16:07:16.886464    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0831 16:07:16.886480    5342 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/18943-957/.minikube CaCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/18943-957/.minikube}
	I0831 16:07:16.886489    5342 buildroot.go:174] setting up certificates
	I0831 16:07:16.886500    5342 provision.go:84] configureAuth start
	I0831 16:07:16.886507    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetMachineName
	I0831 16:07:16.886633    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetIP
	I0831 16:07:16.886735    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:16.886815    5342 provision.go:143] copyHostCerts
	I0831 16:07:16.886842    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 16:07:16.886906    5342 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem, removing ...
	I0831 16:07:16.886912    5342 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem
	I0831 16:07:16.887070    5342 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/key.pem (1675 bytes)
	I0831 16:07:16.887270    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 16:07:16.887309    5342 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem, removing ...
	I0831 16:07:16.887314    5342 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem
	I0831 16:07:16.887392    5342 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/ca.pem (1082 bytes)
	I0831 16:07:16.887546    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 16:07:16.887587    5342 exec_runner.go:144] found /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem, removing ...
	I0831 16:07:16.887592    5342 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem
	I0831 16:07:16.887668    5342 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/18943-957/.minikube/cert.pem (1123 bytes)
	I0831 16:07:16.887815    5342 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem org=jenkins.multinode-957000-m03 san=[127.0.0.1 192.169.0.15 localhost minikube multinode-957000-m03]
	I0831 16:07:17.030060    5342 provision.go:177] copyRemoteCerts
	I0831 16:07:17.030117    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0831 16:07:17.030131    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:17.030314    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:17.030404    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:17.030489    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:17.030576    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.15 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/id_rsa Username:docker}
	I0831 16:07:17.067959    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0831 16:07:17.068032    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0831 16:07:17.087656    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0831 16:07:17.087730    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server.pem --> /etc/docker/server.pem (1229 bytes)
	I0831 16:07:17.107351    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0831 16:07:17.107417    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0831 16:07:17.126384    5342 provision.go:87] duration metric: took 239.875281ms to configureAuth
	I0831 16:07:17.126403    5342 buildroot.go:189] setting minikube options for container-runtime
	I0831 16:07:17.126571    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:07:17.126599    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	I0831 16:07:17.126728    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:17.126822    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:17.126894    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:17.126977    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:17.127057    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:17.127163    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:07:17.127293    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.15 22 <nil> <nil>}
	I0831 16:07:17.127301    5342 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0831 16:07:17.188817    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0831 16:07:17.188828    5342 buildroot.go:70] root file system type: tmpfs
	I0831 16:07:17.188895    5342 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0831 16:07:17.188905    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:17.189031    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:17.189126    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:17.189229    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:17.189321    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:17.189450    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:07:17.189589    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.15 22 <nil> <nil>}
	I0831 16:07:17.189637    5342 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.13"
	Environment="NO_PROXY=192.169.0.13,192.169.0.14"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0831 16:07:17.260225    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.13
	Environment=NO_PROXY=192.169.0.13,192.169.0.14
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0831 16:07:17.260248    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:17.260383    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:17.260466    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:17.260549    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:17.260643    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:17.260773    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:07:17.260921    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.15 22 <nil> <nil>}
	I0831 16:07:17.260934    5342 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0831 16:07:18.819470    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0831 16:07:18.819485    5342 machine.go:96] duration metric: took 13.138616288s to provisionDockerMachine
	I0831 16:07:18.819493    5342 start.go:293] postStartSetup for "multinode-957000-m03" (driver="hyperkit")
	I0831 16:07:18.819505    5342 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0831 16:07:18.819518    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	I0831 16:07:18.819727    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0831 16:07:18.819739    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:18.819838    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:18.819931    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:18.820013    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:18.820095    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.15 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/id_rsa Username:docker}
	I0831 16:07:18.861961    5342 ssh_runner.go:195] Run: cat /etc/os-release
	I0831 16:07:18.865387    5342 command_runner.go:130] > NAME=Buildroot
	I0831 16:07:18.865396    5342 command_runner.go:130] > VERSION=2023.02.9-dirty
	I0831 16:07:18.865401    5342 command_runner.go:130] > ID=buildroot
	I0831 16:07:18.865407    5342 command_runner.go:130] > VERSION_ID=2023.02.9
	I0831 16:07:18.865415    5342 command_runner.go:130] > PRETTY_NAME="Buildroot 2023.02.9"
	I0831 16:07:18.865653    5342 info.go:137] Remote host: Buildroot 2023.02.9
	I0831 16:07:18.865664    5342 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/addons for local assets ...
	I0831 16:07:18.865764    5342 filesync.go:126] Scanning /Users/jenkins/minikube-integration/18943-957/.minikube/files for local assets ...
	I0831 16:07:18.865938    5342 filesync.go:149] local asset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> 14832.pem in /etc/ssl/certs
	I0831 16:07:18.865944    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /etc/ssl/certs/14832.pem
	I0831 16:07:18.866146    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0831 16:07:18.877587    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /etc/ssl/certs/14832.pem (1708 bytes)
	I0831 16:07:18.907903    5342 start.go:296] duration metric: took 88.398003ms for postStartSetup
	I0831 16:07:18.907926    5342 fix.go:56] duration metric: took 13.335408836s for fixHost
	I0831 16:07:18.907940    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:18.908075    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:18.908171    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:18.908258    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:18.908342    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:18.908450    5342 main.go:141] libmachine: Using SSH client type: native
	I0831 16:07:18.908583    5342 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x1e70ea0] 0x1e73c00 <nil>  [] 0s} 192.169.0.15 22 <nil> <nil>}
	I0831 16:07:18.908590    5342 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0831 16:07:18.968165    5342 main.go:141] libmachine: SSH cmd err, output: <nil>: 1725145638.931255763
	
	I0831 16:07:18.968176    5342 fix.go:216] guest clock: 1725145638.931255763
	I0831 16:07:18.968181    5342 fix.go:229] Guest: 2024-08-31 16:07:18.931255763 -0700 PDT Remote: 2024-08-31 16:07:18.907931 -0700 PDT m=+159.108908269 (delta=23.324763ms)
	I0831 16:07:18.968194    5342 fix.go:200] guest clock delta is within tolerance: 23.324763ms
	I0831 16:07:18.968198    5342 start.go:83] releasing machines lock for "multinode-957000-m03", held for 13.395701477s
	I0831 16:07:18.968214    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	I0831 16:07:18.968324    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetIP
	I0831 16:07:18.994426    5342 out.go:177] * Found network options:
	I0831 16:07:19.015254    5342 out.go:177]   - NO_PROXY=192.169.0.13,192.169.0.14
	W0831 16:07:19.037424    5342 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 16:07:19.037457    5342 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 16:07:19.037476    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	I0831 16:07:19.038414    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	I0831 16:07:19.038738    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	I0831 16:07:19.038866    5342 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0831 16:07:19.038908    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	W0831 16:07:19.039011    5342 proxy.go:119] fail to check proxy env: Error ip not in block
	W0831 16:07:19.039034    5342 proxy.go:119] fail to check proxy env: Error ip not in block
	I0831 16:07:19.039138    5342 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0831 16:07:19.039161    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:19.039224    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:19.039407    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:19.039570    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:19.039619    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:19.039752    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:19.039777    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:19.039928    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.15 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/id_rsa Username:docker}
	I0831 16:07:19.039960    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.15 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/id_rsa Username:docker}
	I0831 16:07:19.114858    5342 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I0831 16:07:19.115642    5342 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W0831 16:07:19.115675    5342 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0831 16:07:19.115759    5342 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0831 16:07:19.130365    5342 command_runner.go:139] > /etc/cni/net.d/87-podman-bridge.conflist, 
	I0831 16:07:19.130447    5342 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0831 16:07:19.130457    5342 start.go:495] detecting cgroup driver to use...
	I0831 16:07:19.130528    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 16:07:19.145049    5342 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I0831 16:07:19.145376    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10"|' /etc/containerd/config.toml"
	I0831 16:07:19.154319    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0831 16:07:19.163125    5342 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0831 16:07:19.163171    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0831 16:07:19.172220    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 16:07:19.181284    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0831 16:07:19.190114    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0831 16:07:19.199111    5342 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0831 16:07:19.208322    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0831 16:07:19.217238    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0831 16:07:19.226301    5342 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0831 16:07:19.235213    5342 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0831 16:07:19.243148    5342 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I0831 16:07:19.243297    5342 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0831 16:07:19.251609    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:07:19.348854    5342 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0831 16:07:19.368250    5342 start.go:495] detecting cgroup driver to use...
	I0831 16:07:19.368322    5342 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0831 16:07:19.386495    5342 command_runner.go:130] > # /usr/lib/systemd/system/docker.service
	I0831 16:07:19.387061    5342 command_runner.go:130] > [Unit]
	I0831 16:07:19.387070    5342 command_runner.go:130] > Description=Docker Application Container Engine
	I0831 16:07:19.387075    5342 command_runner.go:130] > Documentation=https://docs.docker.com
	I0831 16:07:19.387081    5342 command_runner.go:130] > After=network.target  minikube-automount.service docker.socket
	I0831 16:07:19.387086    5342 command_runner.go:130] > Requires= minikube-automount.service docker.socket 
	I0831 16:07:19.387090    5342 command_runner.go:130] > StartLimitBurst=3
	I0831 16:07:19.387094    5342 command_runner.go:130] > StartLimitIntervalSec=60
	I0831 16:07:19.387098    5342 command_runner.go:130] > [Service]
	I0831 16:07:19.387101    5342 command_runner.go:130] > Type=notify
	I0831 16:07:19.387105    5342 command_runner.go:130] > Restart=on-failure
	I0831 16:07:19.387109    5342 command_runner.go:130] > Environment=NO_PROXY=192.169.0.13
	I0831 16:07:19.387114    5342 command_runner.go:130] > Environment=NO_PROXY=192.169.0.13,192.169.0.14
	I0831 16:07:19.387120    5342 command_runner.go:130] > # This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	I0831 16:07:19.387129    5342 command_runner.go:130] > # The base configuration already specifies an 'ExecStart=...' command. The first directive
	I0831 16:07:19.387135    5342 command_runner.go:130] > # here is to clear out that command inherited from the base configuration. Without this,
	I0831 16:07:19.387141    5342 command_runner.go:130] > # the command from the base configuration and the command specified here are treated as
	I0831 16:07:19.387147    5342 command_runner.go:130] > # a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	I0831 16:07:19.387153    5342 command_runner.go:130] > # will catch this invalid input and refuse to start the service with an error like:
	I0831 16:07:19.387161    5342 command_runner.go:130] > #  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	I0831 16:07:19.387167    5342 command_runner.go:130] > # NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	I0831 16:07:19.387174    5342 command_runner.go:130] > # container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	I0831 16:07:19.387185    5342 command_runner.go:130] > ExecStart=
	I0831 16:07:19.387198    5342 command_runner.go:130] > ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	I0831 16:07:19.387203    5342 command_runner.go:130] > ExecReload=/bin/kill -s HUP $MAINPID
	I0831 16:07:19.387211    5342 command_runner.go:130] > # Having non-zero Limit*s causes performance problems due to accounting overhead
	I0831 16:07:19.387218    5342 command_runner.go:130] > # in the kernel. We recommend using cgroups to do container-local accounting.
	I0831 16:07:19.387222    5342 command_runner.go:130] > LimitNOFILE=infinity
	I0831 16:07:19.387225    5342 command_runner.go:130] > LimitNPROC=infinity
	I0831 16:07:19.387229    5342 command_runner.go:130] > LimitCORE=infinity
	I0831 16:07:19.387234    5342 command_runner.go:130] > # Uncomment TasksMax if your systemd version supports it.
	I0831 16:07:19.387241    5342 command_runner.go:130] > # Only systemd 226 and above support this version.
	I0831 16:07:19.387247    5342 command_runner.go:130] > TasksMax=infinity
	I0831 16:07:19.387250    5342 command_runner.go:130] > TimeoutStartSec=0
	I0831 16:07:19.387256    5342 command_runner.go:130] > # set delegate yes so that systemd does not reset the cgroups of docker containers
	I0831 16:07:19.387261    5342 command_runner.go:130] > Delegate=yes
	I0831 16:07:19.387270    5342 command_runner.go:130] > # kill only the docker process, not all processes in the cgroup
	I0831 16:07:19.387274    5342 command_runner.go:130] > KillMode=process
	I0831 16:07:19.387277    5342 command_runner.go:130] > [Install]
	I0831 16:07:19.387281    5342 command_runner.go:130] > WantedBy=multi-user.target
	I0831 16:07:19.387341    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 16:07:19.398642    5342 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0831 16:07:19.415286    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0831 16:07:19.426534    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 16:07:19.437432    5342 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0831 16:07:19.459973    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0831 16:07:19.470415    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0831 16:07:19.485260    5342 command_runner.go:130] > runtime-endpoint: unix:///var/run/cri-dockerd.sock
	I0831 16:07:19.485328    5342 ssh_runner.go:195] Run: which cri-dockerd
	I0831 16:07:19.488029    5342 command_runner.go:130] > /usr/bin/cri-dockerd
	I0831 16:07:19.488143    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0831 16:07:19.495336    5342 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (190 bytes)
	I0831 16:07:19.508763    5342 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0831 16:07:19.610104    5342 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0831 16:07:19.710425    5342 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0831 16:07:19.710453    5342 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0831 16:07:19.724467    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:07:19.818684    5342 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0831 16:07:22.072677    5342 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.253954582s)
	I0831 16:07:22.072737    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0831 16:07:22.083340    5342 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0831 16:07:22.096495    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 16:07:22.107028    5342 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0831 16:07:22.197077    5342 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0831 16:07:22.295023    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:07:22.397485    5342 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0831 16:07:22.411559    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0831 16:07:22.422669    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:07:22.526280    5342 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0831 16:07:22.587471    5342 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0831 16:07:22.587549    5342 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0831 16:07:22.591779    5342 command_runner.go:130] >   File: /var/run/cri-dockerd.sock
	I0831 16:07:22.591792    5342 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I0831 16:07:22.591798    5342 command_runner.go:130] > Device: 0,22	Inode: 743         Links: 1
	I0831 16:07:22.591803    5342 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: ( 1000/  docker)
	I0831 16:07:22.591807    5342 command_runner.go:130] > Access: 2024-08-31 23:07:22.565005534 +0000
	I0831 16:07:22.591817    5342 command_runner.go:130] > Modify: 2024-08-31 23:07:22.565005534 +0000
	I0831 16:07:22.591822    5342 command_runner.go:130] > Change: 2024-08-31 23:07:22.567033139 +0000
	I0831 16:07:22.591825    5342 command_runner.go:130] >  Birth: -
	I0831 16:07:22.591939    5342 start.go:563] Will wait 60s for crictl version
	I0831 16:07:22.591991    5342 ssh_runner.go:195] Run: which crictl
	I0831 16:07:22.594916    5342 command_runner.go:130] > /usr/bin/crictl
	I0831 16:07:22.595096    5342 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0831 16:07:22.620993    5342 command_runner.go:130] > Version:  0.1.0
	I0831 16:07:22.621007    5342 command_runner.go:130] > RuntimeName:  docker
	I0831 16:07:22.621011    5342 command_runner.go:130] > RuntimeVersion:  27.2.0
	I0831 16:07:22.621015    5342 command_runner.go:130] > RuntimeApiVersion:  v1
	I0831 16:07:22.622615    5342 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.2.0
	RuntimeApiVersion:  v1
	I0831 16:07:22.622693    5342 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 16:07:22.640952    5342 command_runner.go:130] > 27.2.0
	I0831 16:07:22.642174    5342 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0831 16:07:22.658589    5342 command_runner.go:130] > 27.2.0
	I0831 16:07:22.681171    5342 out.go:235] * Preparing Kubernetes v1.31.0 on Docker 27.2.0 ...
	I0831 16:07:22.702843    5342 out.go:177]   - env NO_PROXY=192.169.0.13
	I0831 16:07:22.723892    5342 out.go:177]   - env NO_PROXY=192.169.0.13,192.169.0.14
	I0831 16:07:22.744942    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetIP
	I0831 16:07:22.745328    5342 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0831 16:07:22.749799    5342 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 16:07:22.760312    5342 mustload.go:65] Loading cluster: multinode-957000
	I0831 16:07:22.760491    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:07:22.760714    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:22.760736    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:22.769484    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53225
	I0831 16:07:22.769829    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:22.770181    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:07:22.770196    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:22.770413    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:22.770536    5342 main.go:141] libmachine: (multinode-957000) Calling .GetState
	I0831 16:07:22.770617    5342 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:07:22.770701    5342 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 5355
	I0831 16:07:22.771685    5342 host.go:66] Checking if "multinode-957000" exists ...
	I0831 16:07:22.771933    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:22.771962    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:22.780531    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53227
	I0831 16:07:22.780877    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:22.781225    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:07:22.781238    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:22.781491    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:22.781637    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:07:22.781760    5342 certs.go:68] Setting up /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000 for IP: 192.169.0.15
	I0831 16:07:22.781768    5342 certs.go:194] generating shared ca certs ...
	I0831 16:07:22.781783    5342 certs.go:226] acquiring lock for ca certs: {Name:mk4bcb4537fb3325fdef6a760db540f754137c29 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0831 16:07:22.781959    5342 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key
	I0831 16:07:22.782042    5342 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key
	I0831 16:07:22.782052    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0831 16:07:22.782080    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0831 16:07:22.782099    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0831 16:07:22.782122    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0831 16:07:22.782216    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem (1338 bytes)
	W0831 16:07:22.782263    5342 certs.go:480] ignoring /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483_empty.pem, impossibly tiny 0 bytes
	I0831 16:07:22.782273    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca-key.pem (1675 bytes)
	I0831 16:07:22.782312    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/ca.pem (1082 bytes)
	I0831 16:07:22.782352    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/cert.pem (1123 bytes)
	I0831 16:07:22.782383    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/key.pem (1675 bytes)
	I0831 16:07:22.782445    5342 certs.go:484] found cert: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem (1708 bytes)
	I0831 16:07:22.782481    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem -> /usr/share/ca-certificates/14832.pem
	I0831 16:07:22.782502    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:07:22.782519    5342 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem -> /usr/share/ca-certificates/1483.pem
	I0831 16:07:22.782543    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0831 16:07:22.802480    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0831 16:07:22.821885    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0831 16:07:22.841048    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0831 16:07:22.860022    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/ssl/certs/14832.pem --> /usr/share/ca-certificates/14832.pem (1708 bytes)
	I0831 16:07:22.879161    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0831 16:07:22.898212    5342 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/18943-957/.minikube/certs/1483.pem --> /usr/share/ca-certificates/1483.pem (1338 bytes)
	I0831 16:07:22.917235    5342 ssh_runner.go:195] Run: openssl version
	I0831 16:07:22.921429    5342 command_runner.go:130] > OpenSSL 1.1.1w  11 Sep 2023
	I0831 16:07:22.921483    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14832.pem && ln -fs /usr/share/ca-certificates/14832.pem /etc/ssl/certs/14832.pem"
	I0831 16:07:22.930586    5342 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14832.pem
	I0831 16:07:22.933734    5342 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 16:07:22.933922    5342 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Aug 31 22:23 /usr/share/ca-certificates/14832.pem
	I0831 16:07:22.933957    5342 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14832.pem
	I0831 16:07:22.937934    5342 command_runner.go:130] > 3ec20f2e
	I0831 16:07:22.938109    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/14832.pem /etc/ssl/certs/3ec20f2e.0"
	I0831 16:07:22.947253    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0831 16:07:22.956158    5342 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:07:22.959318    5342 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:07:22.959491    5342 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Aug 31 22:05 /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:07:22.959536    5342 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0831 16:07:22.963477    5342 command_runner.go:130] > b5213941
	I0831 16:07:22.963645    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0831 16:07:22.972674    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1483.pem && ln -fs /usr/share/ca-certificates/1483.pem /etc/ssl/certs/1483.pem"
	I0831 16:07:22.981508    5342 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1483.pem
	I0831 16:07:22.984596    5342 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 16:07:22.984785    5342 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Aug 31 22:23 /usr/share/ca-certificates/1483.pem
	I0831 16:07:22.984823    5342 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1483.pem
	I0831 16:07:22.988776    5342 command_runner.go:130] > 51391683
	I0831 16:07:22.988984    5342 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1483.pem /etc/ssl/certs/51391683.0"
	I0831 16:07:22.997969    5342 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0831 16:07:23.000898    5342 command_runner.go:130] ! stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 16:07:23.000989    5342 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0831 16:07:23.001021    5342 kubeadm.go:934] updating node {m03 192.169.0.15 0 v1.31.0 docker false true} ...
	I0831 16:07:23.001096    5342 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.31.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=multinode-957000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.15
	
	[Install]
	 config:
	{KubernetesVersion:v1.31.0 ClusterName:multinode-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0831 16:07:23.001135    5342 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.31.0
	I0831 16:07:23.009036    5342 command_runner.go:130] > kubeadm
	I0831 16:07:23.009045    5342 command_runner.go:130] > kubectl
	I0831 16:07:23.009048    5342 command_runner.go:130] > kubelet
	I0831 16:07:23.009150    5342 binaries.go:44] Found k8s binaries, skipping transfer
	I0831 16:07:23.009207    5342 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0831 16:07:23.017007    5342 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (319 bytes)
	I0831 16:07:23.030276    5342 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0831 16:07:23.043653    5342 ssh_runner.go:195] Run: grep 192.169.0.13	control-plane.minikube.internal$ /etc/hosts
	I0831 16:07:23.046458    5342 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.13	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0831 16:07:23.056398    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:07:23.157044    5342 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 16:07:23.171944    5342 host.go:66] Checking if "multinode-957000" exists ...
	I0831 16:07:23.172231    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:23.172254    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:23.181210    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53229
	I0831 16:07:23.181589    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:23.181943    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:07:23.181956    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:23.182155    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:23.182272    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:07:23.182362    5342 start.go:317] joinCluster: &{Name:multinode-957000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.3
1.0 ClusterName:multinode-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.13 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.14 Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true} {Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:f
alse inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOpt
imizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 16:07:23.182446    5342 start.go:330] removing existing worker node "m03" before attempting to rejoin cluster: &{Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0831 16:07:23.182465    5342 host.go:66] Checking if "multinode-957000-m03" exists ...
	I0831 16:07:23.182757    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:23.182780    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:23.191622    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53231
	I0831 16:07:23.191976    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:23.192327    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:07:23.192339    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:23.192574    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:23.192708    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .DriverName
	I0831 16:07:23.192803    5342 mustload.go:65] Loading cluster: multinode-957000
	I0831 16:07:23.192980    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:07:23.193198    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:23.193223    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:23.202117    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53233
	I0831 16:07:23.202488    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:23.202852    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:07:23.202868    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:23.203068    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:23.203171    5342 main.go:141] libmachine: (multinode-957000) Calling .GetState
	I0831 16:07:23.203254    5342 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 16:07:23.203336    5342 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 5355
	I0831 16:07:23.204296    5342 host.go:66] Checking if "multinode-957000" exists ...
	I0831 16:07:23.204567    5342 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 16:07:23.204590    5342 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 16:07:23.213316    5342 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53235
	I0831 16:07:23.213675    5342 main.go:141] libmachine: () Calling .GetVersion
	I0831 16:07:23.214016    5342 main.go:141] libmachine: Using API Version  1
	I0831 16:07:23.214025    5342 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 16:07:23.214263    5342 main.go:141] libmachine: () Calling .GetMachineName
	I0831 16:07:23.214388    5342 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 16:07:23.214482    5342 api_server.go:166] Checking apiserver status ...
	I0831 16:07:23.214535    5342 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 16:07:23.214546    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:07:23.214659    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:07:23.214743    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:07:23.214842    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:07:23.214934    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:07:23.256693    5342 command_runner.go:130] > 1696
	I0831 16:07:23.256763    5342 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1696/cgroup
	W0831 16:07:23.266914    5342 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1696/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 16:07:23.266967    5342 ssh_runner.go:195] Run: ls
	I0831 16:07:23.270970    5342 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 16:07:23.274687    5342 api_server.go:279] https://192.169.0.13:8443/healthz returned 200:
	ok
	I0831 16:07:23.274760    5342 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.31.0/kubectl drain multinode-957000-m03 --force --grace-period=1 --skip-wait-for-delete-timeout=1 --disable-eviction --ignore-daemonsets --delete-emptydir-data
	I0831 16:07:23.408358    5342 command_runner.go:130] ! Warning: ignoring DaemonSet-managed Pods: kube-system/kindnet-cjqw5, kube-system/kube-proxy-ndfs6
	I0831 16:07:23.409556    5342 command_runner.go:130] > node/multinode-957000-m03 cordoned
	I0831 16:07:23.409568    5342 command_runner.go:130] > node/multinode-957000-m03 drained
	I0831 16:07:23.409579    5342 node.go:128] successfully drained node "multinode-957000-m03"
	I0831 16:07:23.409609    5342 ssh_runner.go:195] Run: /bin/bash -c "KUBECONFIG=/var/lib/minikube/kubeconfig sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm reset --force --ignore-preflight-errors=all --cri-socket=unix:///var/run/cri-dockerd.sock"
	I0831 16:07:23.409626    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHHostname
	I0831 16:07:23.409786    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHPort
	I0831 16:07:23.409891    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHKeyPath
	I0831 16:07:23.410007    5342 main.go:141] libmachine: (multinode-957000-m03) Calling .GetSSHUsername
	I0831 16:07:23.410097    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.15 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m03/id_rsa Username:docker}
	I0831 16:07:23.507243    5342 command_runner.go:130] ! W0831 23:07:23.548554    1288 removeetcdmember.go:106] [reset] No kubeadm config, using etcd pod spec to get data directory
	I0831 16:07:23.553201    5342 command_runner.go:130] > [preflight] Running pre-flight checks
	I0831 16:07:23.553216    5342 command_runner.go:130] > [reset] Deleted contents of the etcd data directory: /var/lib/etcd
	I0831 16:07:23.553220    5342 command_runner.go:130] > [reset] Stopping the kubelet service
	I0831 16:07:23.553225    5342 command_runner.go:130] > [reset] Unmounting mounted directories in "/var/lib/kubelet"
	I0831 16:07:23.553238    5342 command_runner.go:130] > [reset] Deleting contents of directories: [/etc/kubernetes/manifests /var/lib/kubelet /etc/kubernetes/pki]
	I0831 16:07:23.553256    5342 command_runner.go:130] > [reset] Deleting files: [/etc/kubernetes/admin.conf /etc/kubernetes/super-admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/bootstrap-kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf]
	I0831 16:07:23.553264    5342 command_runner.go:130] > The reset process does not clean CNI configuration. To do so, you must remove /etc/cni/net.d
	I0831 16:07:23.553273    5342 command_runner.go:130] > The reset process does not reset or clean up iptables rules or IPVS tables.
	I0831 16:07:23.553279    5342 command_runner.go:130] > If you wish to reset iptables, you must do so manually by using the "iptables" command.
	I0831 16:07:23.553285    5342 command_runner.go:130] > If your cluster was setup to utilize IPVS, run ipvsadm --clear (or similar)
	I0831 16:07:23.553289    5342 command_runner.go:130] > to reset your system's IPVS tables.
	I0831 16:07:23.553295    5342 command_runner.go:130] > The reset process does not clean your kubeconfig files and you must remove them manually.
	I0831 16:07:23.553307    5342 command_runner.go:130] > Please, check the contents of the $HOME/.kube/config file.
	I0831 16:07:23.553319    5342 node.go:155] successfully reset node "multinode-957000-m03"
	I0831 16:07:23.553603    5342 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:07:23.553808    5342 kapi.go:59] client config for multinode-957000: &rest.Config{Host:"https://192.169.0.13:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x352cc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 16:07:23.553999    5342 request.go:1351] Request Body: {"kind":"DeleteOptions","apiVersion":"v1"}
	I0831 16:07:23.554029    5342 round_trippers.go:463] DELETE https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:23.554033    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:23.554040    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:23.554044    5342 round_trippers.go:473]     Content-Type: application/json
	I0831 16:07:23.554047    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:23.557271    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:07:23.557284    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:23.557289    5342 round_trippers.go:580]     Audit-Id: e44e5314-1adf-4b71-a26d-fd9459453bd7
	I0831 16:07:23.557292    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:23.557295    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:23.557297    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:23.557300    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:23.557314    5342 round_trippers.go:580]     Content-Length: 171
	I0831 16:07:23.557317    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:23 GMT
	I0831 16:07:23.557328    5342 request.go:1351] Response Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"multinode-957000-m03","kind":"nodes","uid":"0867ece2-944d-429d-b3c6-0eab243276ee"}}
	I0831 16:07:23.557352    5342 node.go:180] successfully deleted node "multinode-957000-m03"
	I0831 16:07:23.557359    5342 start.go:334] successfully removed existing worker node "m03" from cluster: &{Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0831 16:07:23.557377    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0831 16:07:23.557390    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 16:07:23.557539    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 16:07:23.557631    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 16:07:23.557722    5342 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 16:07:23.557824    5342 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 16:07:23.632493    5342 command_runner.go:130] > kubeadm join control-plane.minikube.internal:8443 --token 2n2kvq.luu3dpp1ccc8wtsj --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 
	I0831 16:07:23.633307    5342 start.go:343] trying to join worker node "m03" to cluster: &{Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0831 16:07:23.633329    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 2n2kvq.luu3dpp1ccc8wtsj --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=multinode-957000-m03"
	I0831 16:07:23.663028    5342 command_runner.go:130] > [preflight] Running pre-flight checks
	I0831 16:07:23.741267    5342 command_runner.go:130] > [preflight] Reading configuration from the cluster...
	I0831 16:07:23.741284    5342 command_runner.go:130] > [preflight] FYI: You can look at this config file with 'kubectl -n kube-system get cm kubeadm-config -o yaml'
	I0831 16:07:23.772075    5342 command_runner.go:130] > [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0831 16:07:23.772091    5342 command_runner.go:130] > [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0831 16:07:23.772096    5342 command_runner.go:130] > [kubelet-start] Starting the kubelet
	I0831 16:07:23.885261    5342 command_runner.go:130] > [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I0831 16:07:24.384537    5342 command_runner.go:130] > [kubelet-check] The kubelet is healthy after 504.980251ms
	I0831 16:07:24.384563    5342 command_runner.go:130] > [kubelet-start] Waiting for the kubelet to perform the TLS Bootstrap
	I0831 16:07:24.889736    5342 command_runner.go:130] > This node has joined the cluster:
	I0831 16:07:24.889756    5342 command_runner.go:130] > * Certificate signing request was sent to apiserver and a response was received.
	I0831 16:07:24.889761    5342 command_runner.go:130] > * The Kubelet was informed of the new secure connection details.
	I0831 16:07:24.889767    5342 command_runner.go:130] > Run 'kubectl get nodes' on the control-plane to see this node join the cluster.
	I0831 16:07:24.891676    5342 command_runner.go:130] ! 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0831 16:07:24.891807    5342 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.31.0:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 2n2kvq.luu3dpp1ccc8wtsj --discovery-token-ca-cert-hash sha256:32dc7428c48563e1fc34d58e7581049a6ed795c09c71825e96f6f40c87bfc139 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=multinode-957000-m03": (1.258458941s)
	I0831 16:07:24.891819    5342 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0831 16:07:25.099741    5342 command_runner.go:130] ! Created symlink /etc/systemd/system/multi-user.target.wants/kubelet.service → /usr/lib/systemd/system/kubelet.service.
	I0831 16:07:25.099823    5342 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.31.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes multinode-957000-m03 minikube.k8s.io/updated_at=2024_08_31T16_07_25_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2 minikube.k8s.io/name=multinode-957000 minikube.k8s.io/primary=false
	I0831 16:07:25.159669    5342 command_runner.go:130] > node/multinode-957000-m03 labeled
	I0831 16:07:25.159825    5342 start.go:319] duration metric: took 1.977450125s to joinCluster
	I0831 16:07:25.159869    5342 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.15 Port:0 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0831 16:07:25.160073    5342 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 16:07:25.180327    5342 out.go:177] * Verifying Kubernetes components...
	I0831 16:07:25.254133    5342 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0831 16:07:25.357029    5342 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0831 16:07:25.369258    5342 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 16:07:25.369459    5342 kapi.go:59] client config for multinode-957000: &rest.Config{Host:"https://192.169.0.13:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/profiles/multinode-957000/client.key", CAFile:"/Users/jenkins/minikube-integration/18943-957/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x352cc00), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0831 16:07:25.369644    5342 node_ready.go:35] waiting up to 6m0s for node "multinode-957000-m03" to be "Ready" ...
	I0831 16:07:25.369692    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:25.369698    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:25.369703    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:25.369708    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:25.371316    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:25.371325    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:25.371331    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:25.371348    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:25.371354    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:25 GMT
	I0831 16:07:25.371357    5342 round_trippers.go:580]     Audit-Id: a04723cd-d991-4bb3-98b0-4eb04fd0978a
	I0831 16:07:25.371359    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:25.371362    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:25.371453    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:25.871916    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:25.871939    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:25.871950    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:25.871956    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:25.874471    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:25.874489    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:25.874500    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:26 GMT
	I0831 16:07:25.874508    5342 round_trippers.go:580]     Audit-Id: 27dc8bf7-473a-447f-bac8-efe438f5cd12
	I0831 16:07:25.874516    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:25.874522    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:25.874537    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:25.874543    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:25.874831    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:26.369987    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:26.370008    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:26.370019    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:26.370025    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:26.372406    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:26.372422    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:26.372430    5342 round_trippers.go:580]     Audit-Id: d6c56144-5302-42ac-a6a0-3b5bc1cd62ac
	I0831 16:07:26.372435    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:26.372441    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:26.372446    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:26.372450    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:26.372455    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:26 GMT
	I0831 16:07:26.372528    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:26.869843    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:26.869858    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:26.869865    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:26.869868    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:26.871256    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:26.871268    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:26.871273    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:27 GMT
	I0831 16:07:26.871277    5342 round_trippers.go:580]     Audit-Id: 8307ce0c-22b3-4274-bf09-ba55055c3896
	I0831 16:07:26.871279    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:26.871294    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:26.871300    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:26.871304    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:26.871563    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:27.370223    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:27.370248    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:27.370259    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:27.370268    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:27.373819    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:07:27.373829    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:27.373834    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:27.373838    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:27.373841    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:27.373844    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:27 GMT
	I0831 16:07:27.373847    5342 round_trippers.go:580]     Audit-Id: 72f1b419-07b5-4938-96ce-f19681543910
	I0831 16:07:27.373850    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:27.373909    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:27.374075    5342 node_ready.go:53] node "multinode-957000-m03" has status "Ready":"False"
	I0831 16:07:27.870553    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:27.870578    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:27.870590    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:27.870599    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:27.873215    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:27.873231    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:27.873239    5342 round_trippers.go:580]     Audit-Id: aa468137-0ed9-4181-a987-e2152e0a8786
	I0831 16:07:27.873244    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:27.873266    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:27.873273    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:27.873276    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:27.873281    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:28 GMT
	I0831 16:07:27.873354    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:28.370519    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:28.370539    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:28.370551    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:28.370555    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:28.373033    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:28.373048    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:28.373094    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:28.373103    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:28.373107    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:28.373110    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:28.373114    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:28 GMT
	I0831 16:07:28.373119    5342 round_trippers.go:580]     Audit-Id: af7bb55e-85fc-4dbc-8de1-4036e86aecf9
	I0831 16:07:28.373263    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:28.871926    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:28.871955    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:28.871966    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:28.871971    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:28.874780    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:28.874797    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:28.874804    5342 round_trippers.go:580]     Audit-Id: 094ca1e7-1832-41a8-982b-f2dbcee57c71
	I0831 16:07:28.874808    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:28.874812    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:28.874816    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:28.874819    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:28.874823    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:29 GMT
	I0831 16:07:28.874888    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:29.371521    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:29.371548    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:29.371565    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:29.371608    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:29.374213    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:29.374228    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:29.374236    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:29.374240    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:29.374243    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:29.374246    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:29.374250    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:29 GMT
	I0831 16:07:29.374254    5342 round_trippers.go:580]     Audit-Id: ecf29389-ae3d-46f7-81c7-bd77cf3899c0
	I0831 16:07:29.374340    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:29.374566    5342 node_ready.go:53] node "multinode-957000-m03" has status "Ready":"False"
	I0831 16:07:29.871970    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:29.871992    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:29.872003    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:29.872008    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:29.874635    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:29.874652    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:29.874660    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:30 GMT
	I0831 16:07:29.874671    5342 round_trippers.go:580]     Audit-Id: 85250dd9-e799-4e4d-b7ce-5a8a5b75e300
	I0831 16:07:29.874683    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:29.874687    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:29.874691    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:29.874695    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:29.874764    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:30.371063    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:30.371118    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:30.371153    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:30.371163    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:30.373148    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:30.373169    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:30.373177    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:30.373182    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:30.373186    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:30.373189    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:30 GMT
	I0831 16:07:30.373192    5342 round_trippers.go:580]     Audit-Id: 19696815-97aa-42d7-9257-e4ca4acb6dde
	I0831 16:07:30.373196    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:30.373400    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:30.869871    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:30.869907    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:30.869918    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:30.869924    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:30.871966    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:30.871981    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:30.871990    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:30.871995    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:30.872010    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:30.872018    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:31 GMT
	I0831 16:07:30.872021    5342 round_trippers.go:580]     Audit-Id: ef4a1647-347a-41df-a192-f2c22fcaa172
	I0831 16:07:30.872025    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:30.872286    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:31.371863    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:31.371917    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:31.371957    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:31.371972    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:31.374179    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:31.374193    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:31.374202    5342 round_trippers.go:580]     Audit-Id: e3aada0b-6346-4a94-8796-550c361e847c
	I0831 16:07:31.374207    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:31.374210    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:31.374216    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:31.374220    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:31.374224    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:31 GMT
	I0831 16:07:31.374350    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:31.870217    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:31.870229    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:31.870236    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:31.870239    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:31.871810    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:31.871823    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:31.871830    5342 round_trippers.go:580]     Audit-Id: 6d85ec83-ae85-4292-a168-baa6e0638e41
	I0831 16:07:31.871835    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:31.871840    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:31.871845    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:31.871848    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:31.871850    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:32 GMT
	I0831 16:07:31.871985    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:31.872160    5342 node_ready.go:53] node "multinode-957000-m03" has status "Ready":"False"
	I0831 16:07:32.371991    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:32.372017    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:32.372030    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:32.372035    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:32.374654    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:32.374670    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:32.374678    5342 round_trippers.go:580]     Audit-Id: 98cf6954-1d8e-4438-b7aa-a2a3fc11693b
	I0831 16:07:32.374682    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:32.374686    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:32.374689    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:32.374692    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:32.374696    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:32 GMT
	I0831 16:07:32.374770    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:32.870907    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:32.870934    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:32.870947    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:32.870953    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:32.873564    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:32.873580    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:32.873587    5342 round_trippers.go:580]     Audit-Id: dddea0de-63fb-455c-b9c8-ff920a04e610
	I0831 16:07:32.873596    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:32.873602    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:32.873610    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:32.873615    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:32.873620    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:33 GMT
	I0831 16:07:32.873696    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:33.371127    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:33.371145    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:33.371151    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:33.371155    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:33.372793    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:33.372804    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:33.372809    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:33.372826    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:33 GMT
	I0831 16:07:33.372832    5342 round_trippers.go:580]     Audit-Id: 90734097-a0d7-4482-953e-0708cc0d449d
	I0831 16:07:33.372835    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:33.372837    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:33.372839    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:33.372890    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:33.872032    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:33.872056    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:33.872067    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:33.872074    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:33.874631    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:33.874646    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:33.874653    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:33.874658    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:33.874661    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:33.874664    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:33.874667    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:34 GMT
	I0831 16:07:33.874671    5342 round_trippers.go:580]     Audit-Id: 879bf077-a039-4f82-82c0-8d5cb1d2ce8d
	I0831 16:07:33.874787    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:33.875012    5342 node_ready.go:53] node "multinode-957000-m03" has status "Ready":"False"
	I0831 16:07:34.371949    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:34.371976    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:34.371985    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:34.371992    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:34.374703    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:34.374720    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:34.374741    5342 round_trippers.go:580]     Audit-Id: 157c1399-f108-4962-b457-d007da4b541c
	I0831 16:07:34.374748    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:34.374753    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:34.374757    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:34.374760    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:34.374763    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:34 GMT
	I0831 16:07:34.374946    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1083","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3498 chars]
	I0831 16:07:34.871952    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:34.871973    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:34.871984    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:34.871992    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:34.874926    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:34.874943    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:34.874950    5342 round_trippers.go:580]     Audit-Id: a147798b-d785-4a60-90d9-6e2d47f36dac
	I0831 16:07:34.874962    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:34.874968    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:34.874972    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:34.874975    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:34.874981    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:35 GMT
	I0831 16:07:34.875104    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:35.370351    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:35.370377    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:35.370389    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:35.370395    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:35.372926    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:35.372941    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:35.372950    5342 round_trippers.go:580]     Audit-Id: c2c2e740-c01b-4545-adca-e0a06c4f91a1
	I0831 16:07:35.372958    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:35.372963    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:35.372970    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:35.372981    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:35.372986    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:35 GMT
	I0831 16:07:35.373191    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:35.871341    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:35.871360    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:35.871402    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:35.871409    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:35.873329    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:35.873344    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:35.873353    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:35.873357    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:35.873362    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:35.873378    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:36 GMT
	I0831 16:07:35.873381    5342 round_trippers.go:580]     Audit-Id: c028c066-b32a-425d-b001-85374a367c36
	I0831 16:07:35.873383    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:35.873438    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:36.371533    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:36.371557    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:36.371569    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:36.371574    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:36.374505    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:36.374521    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:36.374529    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:36.374534    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:36 GMT
	I0831 16:07:36.374538    5342 round_trippers.go:580]     Audit-Id: 0bb49313-ccc5-42c9-9ca5-4f1d66365122
	I0831 16:07:36.374542    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:36.374545    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:36.374548    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:36.374717    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:36.374945    5342 node_ready.go:53] node "multinode-957000-m03" has status "Ready":"False"
	I0831 16:07:36.871949    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:36.871976    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:36.871987    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:36.871996    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:36.874836    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:36.874852    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:36.874860    5342 round_trippers.go:580]     Audit-Id: 0faf9586-cd8b-4e9b-acff-7e37b7b288bd
	I0831 16:07:36.874864    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:36.874868    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:36.874871    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:36.874874    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:36.874876    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:37 GMT
	I0831 16:07:36.875233    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:37.371558    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:37.371582    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:37.371594    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:37.371602    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:37.374220    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:37.374236    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:37.374243    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:37.374247    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:37.374251    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:37 GMT
	I0831 16:07:37.374255    5342 round_trippers.go:580]     Audit-Id: d40087f5-3264-4a24-a7fd-0a671aff9bbf
	I0831 16:07:37.374258    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:37.374262    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:37.374337    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:37.871956    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:37.871986    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:37.871997    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:37.872005    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:37.874866    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:37.874897    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:37.874908    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:38 GMT
	I0831 16:07:37.874914    5342 round_trippers.go:580]     Audit-Id: 12a71b17-c542-4e95-b15b-37684ae403c6
	I0831 16:07:37.874919    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:37.874925    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:37.874931    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:37.874937    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:37.875016    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:38.371787    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:38.371817    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:38.371863    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:38.371874    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:38.374884    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:38.374906    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:38.374916    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:38.374923    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:38.374931    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:38.374938    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:38.374946    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:38 GMT
	I0831 16:07:38.374953    5342 round_trippers.go:580]     Audit-Id: 5a7d021d-bffa-4af8-885a-39453935032e
	I0831 16:07:38.375437    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:38.375702    5342 node_ready.go:53] node "multinode-957000-m03" has status "Ready":"False"
	I0831 16:07:38.871168    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:38.871191    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:38.871203    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:38.871212    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:38.874011    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:38.874029    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:38.874037    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:38.874050    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:39 GMT
	I0831 16:07:38.874055    5342 round_trippers.go:580]     Audit-Id: 21b1cdd4-2c97-4ef9-a069-5b51bf53bcaa
	I0831 16:07:38.874059    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:38.874062    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:38.874066    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:38.874148    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:39.371973    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:39.372048    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:39.372062    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:39.372068    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:39.375728    5342 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0831 16:07:39.375747    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:39.375755    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:39.375759    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:39 GMT
	I0831 16:07:39.375762    5342 round_trippers.go:580]     Audit-Id: a38bbc47-5ce3-43f3-95f3-451e979c13e6
	I0831 16:07:39.375773    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:39.375777    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:39.375780    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:39.375853    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:39.870309    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:39.870331    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:39.870343    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:39.870348    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:39.872776    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:39.872789    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:39.872796    5342 round_trippers.go:580]     Audit-Id: 010a5bf0-0c28-4c29-a897-9000bee7e407
	I0831 16:07:39.872801    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:39.872807    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:39.872813    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:39.872820    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:39.872826    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:40 GMT
	I0831 16:07:39.873106    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:40.371998    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:40.372027    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:40.372038    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:40.372046    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:40.374836    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:40.374851    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:40.374858    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:40.374864    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:40 GMT
	I0831 16:07:40.374869    5342 round_trippers.go:580]     Audit-Id: 41c3348f-2ab7-46e9-9ce5-14efd850ff8d
	I0831 16:07:40.374872    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:40.374877    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:40.374882    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:40.374941    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:40.870657    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:40.870674    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:40.870682    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:40.870686    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:40.872509    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:40.872542    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:40.872567    5342 round_trippers.go:580]     Audit-Id: 9fd323e5-1b7a-4de7-b291-4fa71e0a0d30
	I0831 16:07:40.872573    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:40.872577    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:40.872579    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:40.872582    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:40.872585    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:41 GMT
	I0831 16:07:40.872724    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:40.872903    5342 node_ready.go:53] node "multinode-957000-m03" has status "Ready":"False"
	I0831 16:07:41.371156    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:41.371177    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:41.371188    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:41.371195    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:41.373897    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:41.373910    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:41.373917    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:41.373921    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:41.373925    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:41 GMT
	I0831 16:07:41.373929    5342 round_trippers.go:580]     Audit-Id: c9fe0646-f4af-4be9-a23c-71c375753991
	I0831 16:07:41.373933    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:41.373937    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:41.374297    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:41.870253    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:41.870319    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:41.870337    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:41.870345    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:41.872684    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:41.872700    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:41.872708    5342 round_trippers.go:580]     Audit-Id: 5c4c8601-8a31-4855-81b2-fd30779bc2e0
	I0831 16:07:41.872714    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:41.872733    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:41.872740    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:41.872744    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:41.872747    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:42 GMT
	I0831 16:07:41.873113    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:42.372029    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:42.372055    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.372067    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.372073    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.374826    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:42.374843    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.374850    5342 round_trippers.go:580]     Audit-Id: a90ca6d5-8557-4dbf-b1a5-fa6877bb72d5
	I0831 16:07:42.374855    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.374860    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.374863    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.374868    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.374872    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:42 GMT
	I0831 16:07:42.375009    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1106","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1"
:{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}}, [truncated 3890 chars]
	I0831 16:07:42.871333    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:42.871361    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.871372    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.871381    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.874194    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:42.874210    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.874217    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.874222    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.874225    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.874231    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.874244    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:42.874247    5342 round_trippers.go:580]     Audit-Id: d931b232-15e5-47f6-a2db-f179f83eb17a
	I0831 16:07:42.874410    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1118","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 3756 chars]
	I0831 16:07:42.874640    5342 node_ready.go:49] node "multinode-957000-m03" has status "Ready":"True"
	I0831 16:07:42.874651    5342 node_ready.go:38] duration metric: took 17.504894434s for node "multinode-957000-m03" to be "Ready" ...
	I0831 16:07:42.874659    5342 pod_ready.go:36] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 16:07:42.874713    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods
	I0831 16:07:42.874720    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.874728    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.874733    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.877018    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:42.877026    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.877034    5342 round_trippers.go:580]     Audit-Id: 2ba55391-7c82-45a0-bc3b-79e6d3c39caa
	I0831 16:07:42.877038    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.877042    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.877045    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.877050    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.877054    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:42.877673    5342 request.go:1351] Response Body: {"kind":"PodList","apiVersion":"v1","metadata":{"resourceVersion":"1118"},"items":[{"metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"892","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"
f:preferredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers": [truncated 88915 chars]
	I0831 16:07:42.879638    5342 pod_ready.go:79] waiting up to 6m0s for pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:42.879677    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/coredns-6f6b679f8f-q4s6r
	I0831 16:07:42.879684    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.879691    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.879695    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.880748    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:42.880756    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.880761    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.880764    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.880768    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.880773    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:42.880778    5342 round_trippers.go:580]     Audit-Id: b706d80f-c850-46fa-a015-15b9b7cec4c3
	I0831 16:07:42.880781    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.880912    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"coredns-6f6b679f8f-q4s6r","generateName":"coredns-6f6b679f8f-","namespace":"kube-system","uid":"b794efa0-8367-452b-90be-870e8d349f6f","resourceVersion":"892","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"k8s-app":"kube-dns","pod-template-hash":"6f6b679f8f"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"ReplicaSet","name":"coredns-6f6b679f8f","uid":"346c8b34-1a3d-446c-9c90-62b99db583c0","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:k8s-app":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"346c8b34-1a3d-446c-9c90-62b99db583c0\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:podAntiAffinity":{".":{},"f:preferredDuringSchedulingIgnoredDuringExecution":{
}}},"f:containers":{"k:{\"name\":\"coredns\"}":{".":{},"f:args":{},"f:i [truncated 7039 chars]
	I0831 16:07:42.881165    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:42.881173    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.881178    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.881181    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.882163    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:07:42.882173    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.882186    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:42.882195    5342 round_trippers.go:580]     Audit-Id: 880688aa-5c87-484b-aac3-8024a2a31ed2
	I0831 16:07:42.882207    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.882213    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.882217    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.882230    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.882449    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:42.882620    5342 pod_ready.go:93] pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:42.882627    5342 pod_ready.go:82] duration metric: took 2.979299ms for pod "coredns-6f6b679f8f-q4s6r" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:42.882632    5342 pod_ready.go:79] waiting up to 6m0s for pod "etcd-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:42.882662    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/etcd-multinode-957000
	I0831 16:07:42.882667    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.882672    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.882675    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.883659    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:07:42.883667    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.883672    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.883675    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.883679    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.883682    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.883684    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:42.883687    5342 round_trippers.go:580]     Audit-Id: 242ffd78-ecc5-4222-8796-fa142e401040
	I0831 16:07:42.883772    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"etcd-multinode-957000","namespace":"kube-system","uid":"b4833809-a14f-49f4-b877-9f7e4be0bd39","resourceVersion":"857","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"etcd","tier":"control-plane"},"annotations":{"kubeadm.kubernetes.io/etcd.advertise-client-urls":"https://192.169.0.13:2379","kubernetes.io/config.hash":"7ee006dc216d695a2fa4355a2abea57a","kubernetes.io/config.mirror":"7ee006dc216d695a2fa4355a2abea57a","kubernetes.io/config.seen":"2024-08-31T22:57:31.349647295Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubeadm.kubernetes.io/etcd.advertise-cl
ient-urls":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config. [truncated 6663 chars]
	I0831 16:07:42.883995    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:42.884002    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.884008    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.884013    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.885167    5342 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0831 16:07:42.885177    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.885182    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.885185    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:42.885187    5342 round_trippers.go:580]     Audit-Id: 62665a35-ae72-4ff8-8216-d7962f22d581
	I0831 16:07:42.885190    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.885192    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.885194    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.885338    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:42.885511    5342 pod_ready.go:93] pod "etcd-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:42.885519    5342 pod_ready.go:82] duration metric: took 2.882037ms for pod "etcd-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:42.885538    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-apiserver-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:42.885588    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-multinode-957000
	I0831 16:07:42.885593    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.885598    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.885601    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.886561    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:07:42.886570    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.886574    5342 round_trippers.go:580]     Audit-Id: d85ee9ab-63b2-48b5-9af0-9fcf311c92ee
	I0831 16:07:42.886577    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.886581    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.886583    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.886585    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.886587    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:42.886697    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-apiserver-multinode-957000","namespace":"kube-system","uid":"e549c883-0eb6-43a1-be40-c8d2f3a9468e","resourceVersion":"862","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-apiserver","tier":"control-plane"},"annotations":{"kubeadm.kubernetes.io/kube-apiserver.advertise-address.endpoint":"192.169.0.13:8443","kubernetes.io/config.hash":"5db461e18c39888a5ab16fd535bfcb2e","kubernetes.io/config.mirror":"5db461e18c39888a5ab16fd535bfcb2e","kubernetes.io/config.seen":"2024-08-31T22:57:31.349647948Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubeadm.kube
rnetes.io/kube-apiserver.advertise-address.endpoint":{},"f:kubernetes.i [truncated 7891 chars]
	I0831 16:07:42.886929    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:42.886937    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.886945    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.886949    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.887918    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:07:42.887927    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.887932    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:42.887936    5342 round_trippers.go:580]     Audit-Id: d8c7ee9b-8c16-43f4-a1bb-f08e586b534e
	I0831 16:07:42.887939    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.887942    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.887944    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.887948    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.888095    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:42.888264    5342 pod_ready.go:93] pod "kube-apiserver-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:42.888271    5342 pod_ready.go:82] duration metric: took 2.726394ms for pod "kube-apiserver-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:42.888276    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-controller-manager-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:42.888304    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-multinode-957000
	I0831 16:07:42.888309    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.888314    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.888319    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.889276    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:07:42.889283    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.889287    5342 round_trippers.go:580]     Audit-Id: ea521c4b-fa66-4d32-8270-7e26ed1adb65
	I0831 16:07:42.889291    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.889293    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.889296    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.889299    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.889302    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:42.889479    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-controller-manager-multinode-957000","namespace":"kube-system","uid":"8a82b721-75a3-4460-b9eb-bfc4db35f20e","resourceVersion":"859","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-controller-manager","tier":"control-plane"},"annotations":{"kubernetes.io/config.hash":"9edb08d8378ca77b90e86ed290d828c5","kubernetes.io/config.mirror":"9edb08d8378ca77b90e86ed290d828c5","kubernetes.io/config.seen":"2024-08-31T22:57:31.349643093Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config.mirror":{},"f:kubernetes.i
o/config.seen":{},"f:kubernetes.io/config.source":{}},"f:labels":{".":{ [truncated 7464 chars]
	I0831 16:07:42.889729    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:42.889736    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:42.889742    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:42.889746    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:42.890675    5342 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0831 16:07:42.890686    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:42.890693    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:42.890699    5342 round_trippers.go:580]     Audit-Id: 45fd1247-f2a4-423d-b954-694a4a7da4e8
	I0831 16:07:42.890701    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:42.890704    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:42.890706    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:42.890709    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:42.890806    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:42.890973    5342 pod_ready.go:93] pod "kube-controller-manager-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:42.890981    5342 pod_ready.go:82] duration metric: took 2.700532ms for pod "kube-controller-manager-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:42.890988    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-cplv4" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:43.072928    5342 request.go:632] Waited for 181.878881ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cplv4
	I0831 16:07:43.073048    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cplv4
	I0831 16:07:43.073060    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:43.073068    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:43.073075    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:43.075785    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:43.075803    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:43.075813    5342 round_trippers.go:580]     Audit-Id: 8952d48c-dd27-4d4b-82e7-26056e52f532
	I0831 16:07:43.075818    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:43.075823    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:43.075828    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:43.075832    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:43.075842    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:43.076094    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-cplv4","generateName":"kube-proxy-","namespace":"kube-system","uid":"56ad32e2-f2ba-4fa5-b093-790a5205b4f2","resourceVersion":"1002","creationTimestamp":"2024-08-31T22:58:18Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:58:18Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:
requiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k [truncated 6198 chars]
	I0831 16:07:43.272366    5342 request.go:632] Waited for 195.929522ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:43.272416    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m02
	I0831 16:07:43.272425    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:43.272446    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:43.272455    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:43.275238    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:43.275256    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:43.275267    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:43.275273    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:43.275278    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:43.275281    5342 round_trippers.go:580]     Audit-Id: 81b8ef59-ee27-4d16-97fe-ed4a0606bdac
	I0831 16:07:43.275284    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:43.275288    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:43.275359    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m02","uid":"1af472d6-7762-4200-bead-f029dcae1b9b","resourceVersion":"1026","creationTimestamp":"2024-08-31T23:06:48Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m02","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_06_49_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:06:48Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 3805 chars]
	I0831 16:07:43.275581    5342 pod_ready.go:93] pod "kube-proxy-cplv4" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:43.275592    5342 pod_ready.go:82] duration metric: took 384.595555ms for pod "kube-proxy-cplv4" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:43.275600    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-ndfs6" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:43.471919    5342 request.go:632] Waited for 196.271237ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-ndfs6
	I0831 16:07:43.472030    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-ndfs6
	I0831 16:07:43.472041    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:43.472052    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:43.472064    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:43.474774    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:43.474790    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:43.474797    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:43.474834    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:43.474842    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:43.474846    5342 round_trippers.go:580]     Audit-Id: 743c0d5d-a655-49d7-bf86-4e88bfc72a05
	I0831 16:07:43.474850    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:43.474853    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:43.474932    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-ndfs6","generateName":"kube-proxy-","namespace":"kube-system","uid":"34c16419-4c10-41bd-9446-75ba130cbe63","resourceVersion":"1097","creationTimestamp":"2024-08-31T22:59:10Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:59:10Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:
requiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k [truncated 6198 chars]
	I0831 16:07:43.673165    5342 request.go:632] Waited for 197.863612ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:43.673292    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000-m03
	I0831 16:07:43.673303    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:43.673312    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:43.673329    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:43.675952    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:43.675967    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:43.675974    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:43.675979    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:43.675983    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:43.675987    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:43 GMT
	I0831 16:07:43.675991    5342 round_trippers.go:580]     Audit-Id: 317828d2-76e1-47f5-b42a-908052c1406f
	I0831 16:07:43.675994    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:43.676077    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000-m03","uid":"61167462-c773-42ad-a2c9-140a1edd7a31","resourceVersion":"1118","creationTimestamp":"2024-08-31T23:07:24Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000-m03","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"false","minikube.k8s.io/updated_at":"2024_08_31T16_07_25_0700","minikube.k8s.io/version":"v1.33.1"},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T23:07:24Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{
"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-att [truncated 3756 chars]
	I0831 16:07:43.676301    5342 pod_ready.go:93] pod "kube-proxy-ndfs6" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:43.676312    5342 pod_ready.go:82] duration metric: took 400.702324ms for pod "kube-proxy-ndfs6" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:43.676320    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-proxy-zf7j6" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:43.873421    5342 request.go:632] Waited for 197.041694ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zf7j6
	I0831 16:07:43.873564    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zf7j6
	I0831 16:07:43.873576    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:43.873587    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:43.873595    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:43.876262    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:43.876278    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:43.876285    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:43.876291    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:43.876294    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:44 GMT
	I0831 16:07:43.876298    5342 round_trippers.go:580]     Audit-Id: 4d478cf7-7f07-46ec-9e4b-d28a82d834de
	I0831 16:07:43.876302    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:43.876330    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:43.876549    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-proxy-zf7j6","generateName":"kube-proxy-","namespace":"kube-system","uid":"e84c5d55-f27d-4d2a-9b41-6f1e6100ad2e","resourceVersion":"756","creationTimestamp":"2024-08-31T22:57:36Z","labels":{"controller-revision-hash":"5976bc5f75","k8s-app":"kube-proxy","pod-template-generation":"1"},"ownerReferences":[{"apiVersion":"apps/v1","kind":"DaemonSet","name":"kube-proxy","uid":"7b2d5815-fd80-401f-9040-ee043a6144ec","controller":true,"blockOwnerDeletion":true}],"managedFields":[{"manager":"kube-controller-manager","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:36Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:controller-revision-hash":{},"f:k8s-app":{},"f:pod-template-generation":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"7b2d5815-fd80-401f-9040-ee043a6144ec\"}":{}}},"f:spec":{"f:affinity":{".":{},"f:nodeAffinity":{".":{},"f:r
equiredDuringSchedulingIgnoredDuringExecution":{}}},"f:containers":{"k: [truncated 6394 chars]
	I0831 16:07:44.073423    5342 request.go:632] Waited for 196.500693ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:44.073568    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:44.073580    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:44.073593    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:44.073600    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:44.076168    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:44.076184    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:44.076191    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:44.076197    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:44 GMT
	I0831 16:07:44.076200    5342 round_trippers.go:580]     Audit-Id: 845dcdf3-f02f-4cc4-b2d1-3c5d3b5509d0
	I0831 16:07:44.076203    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:44.076207    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:44.076211    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:44.076539    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:44.076800    5342 pod_ready.go:93] pod "kube-proxy-zf7j6" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:44.076811    5342 pod_ready.go:82] duration metric: took 400.483034ms for pod "kube-proxy-zf7j6" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:44.076820    5342 pod_ready.go:79] waiting up to 6m0s for pod "kube-scheduler-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:44.271662    5342 request.go:632] Waited for 194.780323ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-multinode-957000
	I0831 16:07:44.271717    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-multinode-957000
	I0831 16:07:44.271725    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:44.271737    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:44.271744    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:44.274121    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:44.274136    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:44.274143    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:44 GMT
	I0831 16:07:44.274148    5342 round_trippers.go:580]     Audit-Id: ef0ea855-5745-4734-9aec-8c8ac6bf7efd
	I0831 16:07:44.274152    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:44.274158    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:44.274162    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:44.274169    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:44.274273    5342 request.go:1351] Response Body: {"kind":"Pod","apiVersion":"v1","metadata":{"name":"kube-scheduler-multinode-957000","namespace":"kube-system","uid":"f48d9647-8460-48da-a5b0-fc471f5536ad","resourceVersion":"847","creationTimestamp":"2024-08-31T22:57:31Z","labels":{"component":"kube-scheduler","tier":"control-plane"},"annotations":{"kubernetes.io/config.hash":"b74e8393ad84ccbcf23f7560eda422b0","kubernetes.io/config.mirror":"b74e8393ad84ccbcf23f7560eda422b0","kubernetes.io/config.seen":"2024-08-31T22:57:31.349646560Z","kubernetes.io/config.source":"file"},"ownerReferences":[{"apiVersion":"v1","kind":"Node","name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","controller":true}],"managedFields":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time":"2024-08-31T22:57:31Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:kubernetes.io/config.hash":{},"f:kubernetes.io/config.mirror":{},"f:kubernetes.io/config.seen":{},
"f:kubernetes.io/config.source":{}},"f:labels":{".":{},"f:component":{} [truncated 5194 chars]
	I0831 16:07:44.473406    5342 request.go:632] Waited for 198.807942ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:44.473539    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes/multinode-957000
	I0831 16:07:44.473548    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:44.473559    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:44.473568    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:44.476345    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:44.476364    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:44.476372    5342 round_trippers.go:580]     Audit-Id: 1c3b39a5-0866-431c-9c07-24bf630a3d4c
	I0831 16:07:44.476377    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:44.476381    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:44.476385    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:44.476388    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:44.476392    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:44 GMT
	I0831 16:07:44.476542    5342 request.go:1351] Response Body: {"kind":"Node","apiVersion":"v1","metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFields":[{"manager":"kubelet","operation":"Update","api
Version":"v1","time":"2024-08-31T22:57:28Z","fieldsType":"FieldsV1","fi [truncated 5165 chars]
	I0831 16:07:44.476803    5342 pod_ready.go:93] pod "kube-scheduler-multinode-957000" in "kube-system" namespace has status "Ready":"True"
	I0831 16:07:44.476815    5342 pod_ready.go:82] duration metric: took 399.985274ms for pod "kube-scheduler-multinode-957000" in "kube-system" namespace to be "Ready" ...
	I0831 16:07:44.476827    5342 pod_ready.go:39] duration metric: took 1.602150424s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0831 16:07:44.476842    5342 system_svc.go:44] waiting for kubelet service to be running ....
	I0831 16:07:44.476901    5342 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 16:07:44.488487    5342 system_svc.go:56] duration metric: took 11.641943ms WaitForService to wait for kubelet
	I0831 16:07:44.488504    5342 kubeadm.go:582] duration metric: took 19.3285s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0831 16:07:44.488521    5342 node_conditions.go:102] verifying NodePressure condition ...
	I0831 16:07:44.672343    5342 request.go:632] Waited for 183.717578ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.13:8443/api/v1/nodes
	I0831 16:07:44.672389    5342 round_trippers.go:463] GET https://192.169.0.13:8443/api/v1/nodes
	I0831 16:07:44.672425    5342 round_trippers.go:469] Request Headers:
	I0831 16:07:44.672438    5342 round_trippers.go:473]     Accept: application/json, */*
	I0831 16:07:44.672478    5342 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0831 16:07:44.675084    5342 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0831 16:07:44.675100    5342 round_trippers.go:577] Response Headers:
	I0831 16:07:44.675107    5342 round_trippers.go:580]     Date: Sat, 31 Aug 2024 23:07:44 GMT
	I0831 16:07:44.675112    5342 round_trippers.go:580]     Audit-Id: 01d8fe87-9775-4cd3-88cc-ba4d7b8139b4
	I0831 16:07:44.675117    5342 round_trippers.go:580]     Cache-Control: no-cache, private
	I0831 16:07:44.675121    5342 round_trippers.go:580]     Content-Type: application/json
	I0831 16:07:44.675124    5342 round_trippers.go:580]     X-Kubernetes-Pf-Flowschema-Uid: e859dc41-9459-4d38-a411-7f899b419805
	I0831 16:07:44.675127    5342 round_trippers.go:580]     X-Kubernetes-Pf-Prioritylevel-Uid: e5508e4b-a971-4549-880c-2b3187e3c687
	I0831 16:07:44.675523    5342 request.go:1351] Response Body: {"kind":"NodeList","apiVersion":"v1","metadata":{"resourceVersion":"1120"},"items":[{"metadata":{"name":"multinode-957000","uid":"7637fca8-40c4-4b6b-b551-048b24ff0707","resourceVersion":"870","creationTimestamp":"2024-08-31T22:57:29Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/arch":"amd64","kubernetes.io/hostname":"multinode-957000","kubernetes.io/os":"linux","minikube.k8s.io/commit":"8ab9a20c866aaad18bea6fac47c5d146303457d2","minikube.k8s.io/name":"multinode-957000","minikube.k8s.io/primary":"true","minikube.k8s.io/updated_at":"2024_08_31T15_57_32_0700","minikube.k8s.io/version":"v1.33.1","node-role.kubernetes.io/control-plane":"","node.kubernetes.io/exclude-from-external-load-balancers":""},"annotations":{"kubeadm.alpha.kubernetes.io/cri-socket":"unix:///var/run/cri-dockerd.sock","node.alpha.kubernetes.io/ttl":"0","volumes.kubernetes.io/controller-managed-attach-detach":"true"},"managedFie
lds":[{"manager":"kubelet","operation":"Update","apiVersion":"v1","time [truncated 14644 chars]
	I0831 16:07:44.676060    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:07:44.676076    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:07:44.676086    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:07:44.676090    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:07:44.676094    5342 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0831 16:07:44.676099    5342 node_conditions.go:123] node cpu capacity is 2
	I0831 16:07:44.676103    5342 node_conditions.go:105] duration metric: took 187.577266ms to run NodePressure ...
	I0831 16:07:44.676113    5342 start.go:241] waiting for startup goroutines ...
	I0831 16:07:44.676133    5342 start.go:255] writing updated cluster config ...
	I0831 16:07:44.676543    5342 ssh_runner.go:195] Run: rm -f paused
	I0831 16:07:44.717162    5342 start.go:600] kubectl: 1.29.2, cluster: 1.31.0 (minor skew: 2)
	I0831 16:07:44.738597    5342 out.go:201] 
	W0831 16:07:44.759979    5342 out.go:270] ! /usr/local/bin/kubectl is version 1.29.2, which may have incompatibilities with Kubernetes 1.31.0.
	I0831 16:07:44.781832    5342 out.go:177]   - Want kubectl v1.31.0? Try 'minikube kubectl -- get pods -A'
	I0831 16:07:44.860851    5342 out.go:177] * Done! kubectl is now configured to use "multinode-957000" cluster and "default" namespace by default
	
	
	==> Docker <==
	Aug 31 23:05:58 multinode-957000 dockerd[847]: time="2024-08-31T23:05:58.155469498Z" level=info msg="shim disconnected" id=10dcf0ab9505d4a779752a1521968e0396d569b4d15a732ec4b5ab45ace0c6d6 namespace=moby
	Aug 31 23:05:58 multinode-957000 dockerd[847]: time="2024-08-31T23:05:58.155510830Z" level=warning msg="cleaning up after shim disconnected" id=10dcf0ab9505d4a779752a1521968e0396d569b4d15a732ec4b5ab45ace0c6d6 namespace=moby
	Aug 31 23:05:58 multinode-957000 dockerd[847]: time="2024-08-31T23:05:58.155517945Z" level=info msg="cleaning up dead shim" namespace=moby
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.467835553Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.468315035Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.469776275Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.470045407Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.600830403Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.600884649Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.600893569Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.600949385Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 23:05:59 multinode-957000 cri-dockerd[1094]: time="2024-08-31T23:05:59Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/6b36753c0f95db4d3aed451656a171810168b05b95c7c38bec614bf327435e3e/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.699105864Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.699407277Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.699447651Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.699678307Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 23:05:59 multinode-957000 cri-dockerd[1094]: time="2024-08-31T23:05:59Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/84fa2766ab54163c5b311178db191ce0c7bc3bbfa6ebc9d91bd07fee43b025dc/resolv.conf as [nameserver 192.169.0.1]"
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.802410420Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.802479206Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.802492054Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 23:05:59 multinode-957000 dockerd[847]: time="2024-08-31T23:05:59.802721727Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 23:06:13 multinode-957000 dockerd[847]: time="2024-08-31T23:06:13.568336976Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Aug 31 23:06:13 multinode-957000 dockerd[847]: time="2024-08-31T23:06:13.568781914Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Aug 31 23:06:13 multinode-957000 dockerd[847]: time="2024-08-31T23:06:13.568818949Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Aug 31 23:06:13 multinode-957000 dockerd[847]: time="2024-08-31T23:06:13.569424364Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	67e680dff04fc       6e38f40d628db                                                                                         About a minute ago   Running             storage-provisioner       2                   d18852c1a5bd9       storage-provisioner
	e0de7eea0368b       cbb01a7bd410d                                                                                         About a minute ago   Running             coredns                   1                   84fa2766ab541       coredns-6f6b679f8f-q4s6r
	c5b025807af56       8c811b4aec35f                                                                                         About a minute ago   Running             busybox                   1                   6b36753c0f95d       busybox-7dff88458-9qs4p
	74e19a060b7b7       12968670680f4                                                                                         2 minutes ago        Running             kindnet-cni               1                   1892561fbc3fe       kindnet-5vc9x
	10dcf0ab9505d       6e38f40d628db                                                                                         2 minutes ago        Exited              storage-provisioner       1                   d18852c1a5bd9       storage-provisioner
	3ab2e86b804e5       ad83b2ca7b09e                                                                                         2 minutes ago        Running             kube-proxy                1                   074db7937268e       kube-proxy-zf7j6
	c460390b7cdd7       2e96e5913fc06                                                                                         2 minutes ago        Running             etcd                      1                   291d6a1340dcc       etcd-multinode-957000
	45c71e6d7b937       604f5db92eaa8                                                                                         2 minutes ago        Running             kube-apiserver            1                   ad187080960f0       kube-apiserver-multinode-957000
	8e2baadfe0915       045733566833c                                                                                         2 minutes ago        Running             kube-controller-manager   1                   a4515b87626ea       kube-controller-manager-multinode-957000
	a89caa1cde06c       1766f54c897f0                                                                                         2 minutes ago        Running             kube-scheduler            1                   a05516a3cd8f2       kube-scheduler-multinode-957000
	a605c1b684f62       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   9 minutes ago        Exited              busybox                   0                   3df3b183e156e       busybox-7dff88458-9qs4p
	643a3abbab488       cbb01a7bd410d                                                                                         9 minutes ago        Exited              coredns                   0                   634429fa66e1b       coredns-6f6b679f8f-q4s6r
	5960dead3edc5       kindest/kindnetd@sha256:e59a687ca28ae274a2fc92f1e2f5f1c739f353178a43a23aafc71adb802ed166              10 minutes ago       Exited              kindnet-cni               0                   ac20eb760f627       kindnet-5vc9x
	d6ba988e63699       ad83b2ca7b09e                                                                                         10 minutes ago       Exited              kube-proxy                0                   9469c6604c289       kube-proxy-zf7j6
	47934ef0bc6f6       1766f54c897f0                                                                                         10 minutes ago       Exited              kube-scheduler            0                   d1171a7cb88a8       kube-scheduler-multinode-957000
	52037bd64f52b       604f5db92eaa8                                                                                         10 minutes ago       Exited              kube-apiserver            0                   039c066f54893       kube-apiserver-multinode-957000
	6a2eb4fcc96cc       2e96e5913fc06                                                                                         10 minutes ago       Exited              etcd                      0                   e020d44ad2a06       etcd-multinode-957000
	b244e0b6607cf       045733566833c                                                                                         10 minutes ago       Exited              kube-controller-manager   0                   eacabe17d95ab       kube-controller-manager-multinode-957000
	
	
	==> coredns [643a3abbab48] <==
	[INFO] 10.244.1.2:34988 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.00004314s
	[INFO] 10.244.1.2:43335 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000045654s
	[INFO] 10.244.1.2:56454 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000034662s
	[INFO] 10.244.1.2:42532 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000039829s
	[INFO] 10.244.1.2:39644 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000035498s
	[INFO] 10.244.1.2:42601 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000039199s
	[INFO] 10.244.1.2:41638 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000041122s
	[INFO] 10.244.0.3:58767 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000116526s
	[INFO] 10.244.0.3:52535 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000058523s
	[INFO] 10.244.0.3:39520 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000104736s
	[INFO] 10.244.0.3:55917 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000553459s
	[INFO] 10.244.1.2:57631 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000115515s
	[INFO] 10.244.1.2:53564 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000119211s
	[INFO] 10.244.1.2:60768 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000062016s
	[INFO] 10.244.1.2:55461 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121714s
	[INFO] 10.244.0.3:60119 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000078949s
	[INFO] 10.244.0.3:34939 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000730043s
	[INFO] 10.244.0.3:50881 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000103877s
	[INFO] 10.244.0.3:36288 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000124806s
	[INFO] 10.244.1.2:44091 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000088972s
	[INFO] 10.244.1.2:41171 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000141529s
	[INFO] 10.244.1.2:42432 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000044472s
	[INFO] 10.244.1.2:44819 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000058025s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [e0de7eea0368] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:48644 - 14131 "HINFO IN 875711966665504665.2868258244392870741. udp 56 false 512" NXDOMAIN qr,rd,ra 131 0.318327402s
	
	
	==> describe nodes <==
	Name:               multinode-957000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=multinode-957000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=multinode-957000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_08_31T15_57_32_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 22:57:29 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  multinode-957000
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 23:07:39 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 23:05:46 +0000   Sat, 31 Aug 2024 22:57:27 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 23:05:46 +0000   Sat, 31 Aug 2024 22:57:27 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 23:05:46 +0000   Sat, 31 Aug 2024 22:57:27 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 23:05:46 +0000   Sat, 31 Aug 2024 23:05:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.13
	  Hostname:    multinode-957000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 1740a45ffcd14976b41b882d111d9288
	  System UUID:                0c4b4524-0000-0000-9ddd-b85a2c6eb027
	  Boot ID:                    a04f438d-6eba-46eb-89e4-1b000796bf7f
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (9 in total)
	  Namespace                   Name                                        CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                        ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-9qs4p                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         9m3s
	  kube-system                 coredns-6f6b679f8f-q4s6r                    100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     10m
	  kube-system                 etcd-multinode-957000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         10m
	  kube-system                 kindnet-5vc9x                               100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      10m
	  kube-system                 kube-apiserver-multinode-957000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 kube-controller-manager-multinode-957000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 kube-proxy-zf7j6                            0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 kube-scheduler-multinode-957000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         10m
	  kube-system                 storage-provisioner                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         10m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                850m (42%)   100m (5%)
	  memory             220Mi (10%)  220Mi (10%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 10m                    kube-proxy       
	  Normal  Starting                 2m18s                  kube-proxy       
	  Normal  NodeHasSufficientPID     10m                    kubelet          Node multinode-957000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  10m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  10m                    kubelet          Node multinode-957000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    10m                    kubelet          Node multinode-957000 status is now: NodeHasNoDiskPressure
	  Normal  Starting                 10m                    kubelet          Starting kubelet.
	  Normal  RegisteredNode           10m                    node-controller  Node multinode-957000 event: Registered Node multinode-957000 in Controller
	  Normal  NodeReady                9m51s                  kubelet          Node multinode-957000 status is now: NodeReady
	  Normal  Starting                 2m23s                  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  2m23s (x8 over 2m23s)  kubelet          Node multinode-957000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    2m23s (x8 over 2m23s)  kubelet          Node multinode-957000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     2m23s (x7 over 2m23s)  kubelet          Node multinode-957000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  2m23s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           2m17s                  node-controller  Node multinode-957000 event: Registered Node multinode-957000 in Controller
	
	
	Name:               multinode-957000-m02
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=multinode-957000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=multinode-957000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T16_06_49_0700
	                    minikube.k8s.io/version=v1.33.1
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 23:06:48 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  multinode-957000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 23:07:39 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 23:07:03 +0000   Sat, 31 Aug 2024 23:06:48 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 23:07:03 +0000   Sat, 31 Aug 2024 23:06:48 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 23:07:03 +0000   Sat, 31 Aug 2024 23:06:48 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 23:07:03 +0000   Sat, 31 Aug 2024 23:07:03 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.14
	  Hostname:    multinode-957000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 a1b48c47109045c28536f4a503e68942
	  System UUID:                26b642df-0000-0000-92e6-b799b636693f
	  Boot ID:                    c63a6dd7-4718-487b-80bb-6205b564ba2b
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                       ------------  ----------  ---------------  -------------  ---
	  default                     busybox-7dff88458-lt6fd    0 (0%)        0 (0%)      0 (0%)           0 (0%)         62s
	  kube-system                 kindnet-gkhfh              100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      9m28s
	  kube-system                 kube-proxy-cplv4           0 (0%)        0 (0%)      0 (0%)           0 (0%)         9m28s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type    Reason                   Age                    From        Message
	  ----    ------                   ----                   ----        -------
	  Normal  Starting                 9m21s                  kube-proxy  
	  Normal  Starting                 55s                    kube-proxy  
	  Normal  NodeHasSufficientMemory  9m28s (x2 over 9m28s)  kubelet     Node multinode-957000-m02 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    9m28s (x2 over 9m28s)  kubelet     Node multinode-957000-m02 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     9m28s (x2 over 9m28s)  kubelet     Node multinode-957000-m02 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  9m28s                  kubelet     Updated Node Allocatable limit across pods
	  Normal  NodeReady                9m5s                   kubelet     Node multinode-957000-m02 status is now: NodeReady
	  Normal  NodeHasSufficientMemory  58s (x2 over 58s)      kubelet     Node multinode-957000-m02 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    58s (x2 over 58s)      kubelet     Node multinode-957000-m02 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     58s (x2 over 58s)      kubelet     Node multinode-957000-m02 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  58s                    kubelet     Updated Node Allocatable limit across pods
	  Normal  NodeReady                43s                    kubelet     Node multinode-957000-m02 status is now: NodeReady
	
	
	Name:               multinode-957000-m03
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=multinode-957000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=8ab9a20c866aaad18bea6fac47c5d146303457d2
	                    minikube.k8s.io/name=multinode-957000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_08_31T16_07_25_0700
	                    minikube.k8s.io/version=v1.33.1
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Sat, 31 Aug 2024 23:07:24 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  multinode-957000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Sat, 31 Aug 2024 23:07:44 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Sat, 31 Aug 2024 23:07:42 +0000   Sat, 31 Aug 2024 23:07:24 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Sat, 31 Aug 2024 23:07:42 +0000   Sat, 31 Aug 2024 23:07:24 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Sat, 31 Aug 2024 23:07:42 +0000   Sat, 31 Aug 2024 23:07:24 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Sat, 31 Aug 2024 23:07:42 +0000   Sat, 31 Aug 2024 23:07:42 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.15
	  Hostname:    multinode-957000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 213a8ffcfd814d68b401298b17a40a93
	  System UUID:                93064071-0000-0000-8737-9a8c8096e22e
	  Boot ID:                    58f83dff-b6be-4f66-a777-eb838119edbb
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.2.0
	  Kubelet Version:            v1.31.0
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (2 in total)
	  Namespace                   Name                CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                ------------  ----------  ---------------  -------------  ---
	  kube-system                 kindnet-cjqw5       100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      8m36s
	  kube-system                 kube-proxy-ndfs6    0 (0%)        0 (0%)      0 (0%)           0 (0%)         8m36s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type    Reason                   Age                    From        Message
	  ----    ------                   ----                   ----        -------
	  Normal  Starting                 8m29s                  kube-proxy  
	  Normal  Starting                 19s                    kube-proxy  
	  Normal  Starting                 7m39s                  kube-proxy  
	  Normal  NodeAllocatableEnforced  8m37s                  kubelet     Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  8m36s (x2 over 8m37s)  kubelet     Node multinode-957000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    8m36s (x2 over 8m37s)  kubelet     Node multinode-957000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     8m36s (x2 over 8m37s)  kubelet     Node multinode-957000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeReady                8m14s                  kubelet     Node multinode-957000-m03 status is now: NodeReady
	  Normal  NodeHasSufficientPID     7m42s (x2 over 7m42s)  kubelet     Node multinode-957000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  7m42s                  kubelet     Updated Node Allocatable limit across pods
	  Normal  NodeHasNoDiskPressure    7m42s (x2 over 7m42s)  kubelet     Node multinode-957000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientMemory  7m42s (x2 over 7m42s)  kubelet     Node multinode-957000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeReady                7m23s                  kubelet     Node multinode-957000-m03 status is now: NodeReady
	  Normal  NodeHasSufficientMemory  22s (x2 over 22s)      kubelet     Node multinode-957000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    22s (x2 over 22s)      kubelet     Node multinode-957000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     22s (x2 over 22s)      kubelet     Node multinode-957000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  22s                    kubelet     Updated Node Allocatable limit across pods
	  Normal  NodeReady                4s                     kubelet     Node multinode-957000-m03 status is now: NodeReady
	
	
	==> dmesg <==
	[  +0.008159] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[  +5.643212] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.007005] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.806217] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.243987] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000008] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[Aug31 23:05] systemd-fstab-generator[487]: Ignoring "noauto" option for root device
	[  +0.094242] systemd-fstab-generator[499]: Ignoring "noauto" option for root device
	[  +1.815557] systemd-fstab-generator[769]: Ignoring "noauto" option for root device
	[  +0.246053] systemd-fstab-generator[806]: Ignoring "noauto" option for root device
	[  +0.115399] systemd-fstab-generator[818]: Ignoring "noauto" option for root device
	[  +0.107659] systemd-fstab-generator[832]: Ignoring "noauto" option for root device
	[  +2.441977] systemd-fstab-generator[1047]: Ignoring "noauto" option for root device
	[  +0.103451] systemd-fstab-generator[1059]: Ignoring "noauto" option for root device
	[  +0.097936] systemd-fstab-generator[1071]: Ignoring "noauto" option for root device
	[  +0.048259] kauditd_printk_skb: 239 callbacks suppressed
	[  +0.070883] systemd-fstab-generator[1086]: Ignoring "noauto" option for root device
	[  +0.417098] systemd-fstab-generator[1212]: Ignoring "noauto" option for root device
	[  +1.584779] systemd-fstab-generator[1343]: Ignoring "noauto" option for root device
	[  +4.615855] kauditd_printk_skb: 128 callbacks suppressed
	[  +2.880438] systemd-fstab-generator[2190]: Ignoring "noauto" option for root device
	[ +27.381404] kauditd_printk_skb: 72 callbacks suppressed
	[Aug31 23:06] kauditd_printk_skb: 15 callbacks suppressed
	
	
	==> etcd [6a2eb4fcc96c] <==
	{"level":"info","ts":"2024-08-31T22:57:27.657551Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: e0290fa3161c5471 elected leader e0290fa3161c5471 at term 2"}
	{"level":"info","ts":"2024-08-31T22:57:27.662478Z","caller":"etcdserver/server.go:2118","msg":"published local member to cluster through raft","local-member-id":"e0290fa3161c5471","local-member-attributes":"{Name:multinode-957000 ClientURLs:[https://192.169.0.13:2379]}","request-path":"/0/members/e0290fa3161c5471/attributes","cluster-id":"87b46e718846f146","publish-timeout":"7s"}
	{"level":"info","ts":"2024-08-31T22:57:27.662664Z","caller":"etcdserver/server.go:2629","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-08-31T22:57:27.665518Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-08-31T22:57:27.665756Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-08-31T22:57:27.665978Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-08-31T22:57:27.666074Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-08-31T22:57:27.668074Z","caller":"v3rpc/health.go:61","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2024-08-31T22:57:27.672011Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.169.0.13:2379"}
	{"level":"info","ts":"2024-08-31T22:57:27.672157Z","caller":"v3rpc/health.go:61","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2024-08-31T22:57:27.672982Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-08-31T22:57:27.676751Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"87b46e718846f146","local-member-id":"e0290fa3161c5471","cluster-version":"3.5"}
	{"level":"info","ts":"2024-08-31T22:57:27.676812Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-08-31T22:57:27.676909Z","caller":"etcdserver/server.go:2653","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-08-31T22:57:32.252164Z","caller":"traceutil/trace.go:171","msg":"trace[23570143] transaction","detail":"{read_only:false; response_revision:257; number_of_response:1; }","duration":"111.149692ms","start":"2024-08-31T22:57:32.141003Z","end":"2024-08-31T22:57:32.252153Z","steps":["trace[23570143] 'process raft request'  (duration: 29.768067ms)","trace[23570143] 'compare'  (duration: 81.325233ms)"],"step_count":2}
	{"level":"info","ts":"2024-08-31T23:00:36.404282Z","caller":"osutil/interrupt_unix.go:64","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2024-08-31T23:00:36.404338Z","caller":"embed/etcd.go:377","msg":"closing etcd server","name":"multinode-957000","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.169.0.13:2380"],"advertise-client-urls":["https://192.169.0.13:2379"]}
	{"level":"warn","ts":"2024-08-31T23:00:36.404393Z","caller":"embed/serve.go:212","msg":"stopping secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2024-08-31T23:00:36.404466Z","caller":"embed/serve.go:214","msg":"stopped secure grpc server due to error","error":"accept tcp 127.0.0.1:2379: use of closed network connection"}
	{"level":"warn","ts":"2024-08-31T23:00:36.421063Z","caller":"embed/serve.go:212","msg":"stopping secure grpc server due to error","error":"accept tcp 192.169.0.13:2379: use of closed network connection"}
	{"level":"warn","ts":"2024-08-31T23:00:36.421087Z","caller":"embed/serve.go:214","msg":"stopped secure grpc server due to error","error":"accept tcp 192.169.0.13:2379: use of closed network connection"}
	{"level":"info","ts":"2024-08-31T23:00:36.421117Z","caller":"etcdserver/server.go:1521","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"e0290fa3161c5471","current-leader-member-id":"e0290fa3161c5471"}
	{"level":"info","ts":"2024-08-31T23:00:36.422383Z","caller":"embed/etcd.go:581","msg":"stopping serving peer traffic","address":"192.169.0.13:2380"}
	{"level":"info","ts":"2024-08-31T23:00:36.422431Z","caller":"embed/etcd.go:586","msg":"stopped serving peer traffic","address":"192.169.0.13:2380"}
	{"level":"info","ts":"2024-08-31T23:00:36.422438Z","caller":"embed/etcd.go:379","msg":"closed etcd server","name":"multinode-957000","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.169.0.13:2380"],"advertise-client-urls":["https://192.169.0.13:2379"]}
	
	
	==> etcd [c460390b7cdd] <==
	{"level":"info","ts":"2024-08-31T23:05:24.520074Z","caller":"v3rpc/health.go:61","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2024-08-31T23:05:24.521341Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"87b46e718846f146","local-member-id":"e0290fa3161c5471","cluster-version":"3.5"}
	{"level":"info","ts":"2024-08-31T23:05:24.523396Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-08-31T23:05:24.524695Z","caller":"embed/etcd.go:728","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2024-08-31T23:05:24.525811Z","caller":"embed/etcd.go:279","msg":"now serving peer/client/metrics","local-member-id":"e0290fa3161c5471","initial-advertise-peer-urls":["https://192.169.0.13:2380"],"listen-peer-urls":["https://192.169.0.13:2380"],"advertise-client-urls":["https://192.169.0.13:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.169.0.13:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2024-08-31T23:05:24.525848Z","caller":"embed/etcd.go:870","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2024-08-31T23:05:24.525996Z","caller":"embed/etcd.go:599","msg":"serving peer traffic","address":"192.169.0.13:2380"}
	{"level":"info","ts":"2024-08-31T23:05:24.526024Z","caller":"embed/etcd.go:571","msg":"cmux::serve","address":"192.169.0.13:2380"}
	{"level":"info","ts":"2024-08-31T23:05:24.527184Z","caller":"etcdserver/server.go:751","msg":"started as single-node; fast-forwarding election ticks","local-member-id":"e0290fa3161c5471","forward-ticks":9,"forward-duration":"900ms","election-ticks":10,"election-timeout":"1s"}
	{"level":"info","ts":"2024-08-31T23:05:25.293193Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"e0290fa3161c5471 is starting a new election at term 2"}
	{"level":"info","ts":"2024-08-31T23:05:25.293244Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"e0290fa3161c5471 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-08-31T23:05:25.293265Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"e0290fa3161c5471 received MsgPreVoteResp from e0290fa3161c5471 at term 2"}
	{"level":"info","ts":"2024-08-31T23:05:25.293287Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"e0290fa3161c5471 became candidate at term 3"}
	{"level":"info","ts":"2024-08-31T23:05:25.293293Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"e0290fa3161c5471 received MsgVoteResp from e0290fa3161c5471 at term 3"}
	{"level":"info","ts":"2024-08-31T23:05:25.293391Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"e0290fa3161c5471 became leader at term 3"}
	{"level":"info","ts":"2024-08-31T23:05:25.293423Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: e0290fa3161c5471 elected leader e0290fa3161c5471 at term 3"}
	{"level":"info","ts":"2024-08-31T23:05:25.294581Z","caller":"etcdserver/server.go:2118","msg":"published local member to cluster through raft","local-member-id":"e0290fa3161c5471","local-member-attributes":"{Name:multinode-957000 ClientURLs:[https://192.169.0.13:2379]}","request-path":"/0/members/e0290fa3161c5471/attributes","cluster-id":"87b46e718846f146","publish-timeout":"7s"}
	{"level":"info","ts":"2024-08-31T23:05:25.295681Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-08-31T23:05:25.296080Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-08-31T23:05:25.296870Z","caller":"v3rpc/health.go:61","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2024-08-31T23:05:25.297568Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-08-31T23:05:25.298908Z","caller":"v3rpc/health.go:61","msg":"grpc service status changed","service":"","status":"SERVING"}
	{"level":"info","ts":"2024-08-31T23:05:25.299577Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.169.0.13:2379"}
	{"level":"info","ts":"2024-08-31T23:05:25.300631Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-08-31T23:05:25.300663Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	
	
	==> kernel <==
	 23:07:47 up 3 min,  0 users,  load average: 0.09, 0.09, 0.03
	Linux multinode-957000 5.10.207 #1 SMP Wed Aug 28 20:54:17 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [5960dead3edc] <==
	I0831 23:00:01.478330       1 main.go:295] Handling node with IPs: map[192.169.0.13:{}]
	I0831 23:00:01.478352       1 main.go:299] handling current node
	I0831 23:00:01.478363       1 main.go:295] Handling node with IPs: map[192.169.0.14:{}]
	I0831 23:00:01.478366       1 main.go:322] Node multinode-957000-m02 has CIDR [10.244.1.0/24] 
	I0831 23:00:01.478444       1 main.go:295] Handling node with IPs: map[192.169.0.15:{}]
	I0831 23:00:01.478449       1 main.go:322] Node multinode-957000-m03 has CIDR [10.244.2.0/24] 
	I0831 23:00:11.469684       1 main.go:295] Handling node with IPs: map[192.169.0.13:{}]
	I0831 23:00:11.469755       1 main.go:299] handling current node
	I0831 23:00:11.469825       1 main.go:295] Handling node with IPs: map[192.169.0.14:{}]
	I0831 23:00:11.469881       1 main.go:322] Node multinode-957000-m02 has CIDR [10.244.1.0/24] 
	I0831 23:00:11.470014       1 main.go:295] Handling node with IPs: map[192.169.0.15:{}]
	I0831 23:00:11.470035       1 main.go:322] Node multinode-957000-m03 has CIDR [10.244.3.0/24] 
	I0831 23:00:11.470128       1 routes.go:62] Adding route {Ifindex: 0 Dst: 10.244.3.0/24 Src: <nil> Gw: 192.169.0.15 Flags: [] Table: 0} 
	I0831 23:00:21.474167       1 main.go:295] Handling node with IPs: map[192.169.0.15:{}]
	I0831 23:00:21.474416       1 main.go:322] Node multinode-957000-m03 has CIDR [10.244.3.0/24] 
	I0831 23:00:21.474668       1 main.go:295] Handling node with IPs: map[192.169.0.13:{}]
	I0831 23:00:21.474810       1 main.go:299] handling current node
	I0831 23:00:21.474842       1 main.go:295] Handling node with IPs: map[192.169.0.14:{}]
	I0831 23:00:21.474859       1 main.go:322] Node multinode-957000-m02 has CIDR [10.244.1.0/24] 
	I0831 23:00:31.472819       1 main.go:295] Handling node with IPs: map[192.169.0.13:{}]
	I0831 23:00:31.472886       1 main.go:299] handling current node
	I0831 23:00:31.472904       1 main.go:295] Handling node with IPs: map[192.169.0.14:{}]
	I0831 23:00:31.472913       1 main.go:322] Node multinode-957000-m02 has CIDR [10.244.1.0/24] 
	I0831 23:00:31.473129       1 main.go:295] Handling node with IPs: map[192.169.0.15:{}]
	I0831 23:00:31.473187       1 main.go:322] Node multinode-957000-m03 has CIDR [10.244.3.0/24] 
	
	
	==> kindnet [74e19a060b7b] <==
	I0831 23:07:09.213774       1 main.go:295] Handling node with IPs: map[192.169.0.13:{}]
	I0831 23:07:09.213960       1 main.go:299] handling current node
	I0831 23:07:09.214028       1 main.go:295] Handling node with IPs: map[192.169.0.14:{}]
	I0831 23:07:09.214075       1 main.go:322] Node multinode-957000-m02 has CIDR [10.244.1.0/24] 
	I0831 23:07:09.214235       1 main.go:295] Handling node with IPs: map[192.169.0.15:{}]
	I0831 23:07:09.214325       1 main.go:322] Node multinode-957000-m03 has CIDR [10.244.3.0/24] 
	I0831 23:07:19.211302       1 main.go:295] Handling node with IPs: map[192.169.0.15:{}]
	I0831 23:07:19.211475       1 main.go:322] Node multinode-957000-m03 has CIDR [10.244.3.0/24] 
	I0831 23:07:19.211879       1 main.go:295] Handling node with IPs: map[192.169.0.13:{}]
	I0831 23:07:19.212258       1 main.go:299] handling current node
	I0831 23:07:19.212562       1 main.go:295] Handling node with IPs: map[192.169.0.14:{}]
	I0831 23:07:19.212864       1 main.go:322] Node multinode-957000-m02 has CIDR [10.244.1.0/24] 
	I0831 23:07:29.212239       1 main.go:295] Handling node with IPs: map[192.169.0.13:{}]
	I0831 23:07:29.212578       1 main.go:299] handling current node
	I0831 23:07:29.212699       1 main.go:295] Handling node with IPs: map[192.169.0.14:{}]
	I0831 23:07:29.212777       1 main.go:322] Node multinode-957000-m02 has CIDR [10.244.1.0/24] 
	I0831 23:07:29.212942       1 main.go:295] Handling node with IPs: map[192.169.0.15:{}]
	I0831 23:07:29.213038       1 main.go:322] Node multinode-957000-m03 has CIDR [10.244.2.0/24] 
	I0831 23:07:29.213213       1 routes.go:62] Adding route {Ifindex: 0 Dst: 10.244.2.0/24 Src: <nil> Gw: 192.169.0.15 Flags: [] Table: 0} 
	I0831 23:07:39.211937       1 main.go:295] Handling node with IPs: map[192.169.0.13:{}]
	I0831 23:07:39.212064       1 main.go:299] handling current node
	I0831 23:07:39.212105       1 main.go:295] Handling node with IPs: map[192.169.0.14:{}]
	I0831 23:07:39.212131       1 main.go:322] Node multinode-957000-m02 has CIDR [10.244.1.0/24] 
	I0831 23:07:39.212282       1 main.go:295] Handling node with IPs: map[192.169.0.15:{}]
	I0831 23:07:39.212364       1 main.go:322] Node multinode-957000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [45c71e6d7b93] <==
	I0831 23:05:26.464337       1 cache.go:39] Caches are synced for RemoteAvailability controller
	E0831 23:05:26.468401       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I0831 23:05:26.470594       1 handler_discovery.go:450] Starting ResourceDiscoveryManager
	I0831 23:05:26.473611       1 shared_informer.go:320] Caches are synced for configmaps
	I0831 23:05:26.473619       1 cache.go:39] Caches are synced for LocalAvailability controller
	I0831 23:05:26.475259       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0831 23:05:26.477231       1 apf_controller.go:382] Running API Priority and Fairness config worker
	I0831 23:05:26.477364       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0831 23:05:26.477575       1 aggregator.go:171] initial CRD sync complete...
	I0831 23:05:26.477694       1 autoregister_controller.go:144] Starting autoregister controller
	I0831 23:05:26.477856       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0831 23:05:26.477913       1 cache.go:39] Caches are synced for autoregister controller
	I0831 23:05:26.477411       1 apf_controller.go:385] Running API Priority and Fairness periodic rebalancing process
	I0831 23:05:26.481204       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0831 23:05:26.495442       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0831 23:05:26.495546       1 policy_source.go:224] refreshing policies
	I0831 23:05:26.506624       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0831 23:05:27.368431       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0831 23:05:28.626707       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0831 23:05:28.764783       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0831 23:05:28.774264       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0831 23:05:28.836080       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0831 23:05:28.840744       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0831 23:05:30.099168       1 controller.go:615] quota admission added evaluator for: endpoints
	I0831 23:05:30.149919       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	
	
	==> kube-apiserver [52037bd64f52] <==
	W0831 23:00:36.409923       1 logging.go:55] [core] [Channel #100 SubChannel #101]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.409940       1 logging.go:55] [core] [Channel #76 SubChannel #77]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.409963       1 logging.go:55] [core] [Channel #118 SubChannel #119]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.409982       1 logging.go:55] [core] [Channel #106 SubChannel #107]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410000       1 logging.go:55] [core] [Channel #151 SubChannel #152]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410019       1 logging.go:55] [core] [Channel #49 SubChannel #50]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410046       1 logging.go:55] [core] [Channel #88 SubChannel #89]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410095       1 logging.go:55] [core] [Channel #109 SubChannel #110]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410115       1 logging.go:55] [core] [Channel #127 SubChannel #128]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410146       1 logging.go:55] [core] [Channel #115 SubChannel #116]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410155       1 logging.go:55] [core] [Channel #97 SubChannel #98]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410180       1 logging.go:55] [core] [Channel #136 SubChannel #137]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410199       1 logging.go:55] [core] [Channel #133 SubChannel #134]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410230       1 logging.go:55] [core] [Channel #142 SubChannel #143]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410249       1 logging.go:55] [core] [Channel #70 SubChannel #71]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410268       1 logging.go:55] [core] [Channel #103 SubChannel #104]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410290       1 logging.go:55] [core] [Channel #148 SubChannel #149]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410308       1 logging.go:55] [core] [Channel #154 SubChannel #155]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410341       1 logging.go:55] [core] [Channel #169 SubChannel #170]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410361       1 logging.go:55] [core] [Channel #172 SubChannel #173]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410382       1 logging.go:55] [core] [Channel #37 SubChannel #38]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.410401       1 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.411842       1 logging.go:55] [core] [Channel #130 SubChannel #131]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	W0831 23:00:36.411880       1 logging.go:55] [core] [Channel #2 SubChannel #4]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused"
	I0831 23:00:36.433002       1 controller.go:128] Shutting down kubernetes service endpoint reconciler
	
	
	==> kube-controller-manager [8e2baadfe091] <==
	I0831 23:07:14.737922       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="43.319µs"
	I0831 23:07:14.739914       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="28.005µs"
	I0831 23:07:14.747869       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="38.43µs"
	I0831 23:07:14.906103       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="60.503µs"
	I0831 23:07:14.907658       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="31.71µs"
	I0831 23:07:15.921211       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="3.472273ms"
	I0831 23:07:15.921300       1 replica_set.go:679] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-7dff88458" duration="49.801µs"
	I0831 23:07:23.569823       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:23.582790       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:23.733558       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="multinode-957000-m02"
	I0831 23:07:23.733929       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:24.612941       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="multinode-957000-m02"
	I0831 23:07:24.613528       1 actual_state_of_world.go:540] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"multinode-957000-m03\" does not exist"
	I0831 23:07:24.618163       1 range_allocator.go:422] "Set node PodCIDR" logger="node-ipam-controller" node="multinode-957000-m03" podCIDRs=["10.244.2.0/24"]
	I0831 23:07:24.618201       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:24.620439       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:24.623780       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:24.833679       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:25.066733       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:25.334973       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:34.614400       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:42.725012       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="multinode-957000-m02"
	I0831 23:07:42.725326       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:42.731019       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:07:44.794259       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	
	
	==> kube-controller-manager [b244e0b6607c] <==
	I0831 22:59:10.768785       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 22:59:20.473106       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 22:59:32.990970       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="multinode-957000-m03"
	I0831 22:59:32.991298       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 22:59:32.996819       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 22:59:35.741127       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 22:59:40.868779       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:03.721035       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:03.729818       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:03.928564       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:03.928640       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="multinode-957000-m02"
	I0831 23:00:04.750193       1 actual_state_of_world.go:540] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"multinode-957000-m03\" does not exist"
	I0831 23:00:04.750341       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="multinode-957000-m02"
	I0831 23:00:04.764798       1 range_allocator.go:422] "Set node PodCIDR" logger="node-ipam-controller" node="multinode-957000-m03" podCIDRs=["10.244.3.0/24"]
	I0831 23:00:04.764837       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:04.764856       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:04.764951       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:04.770582       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:05.074707       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:05.770121       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:15.158834       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:23.099357       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="multinode-957000-m02"
	I0831 23:00:23.099744       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:23.105294       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	I0831 23:00:25.756215       1 range_allocator.go:241] "Successfully synced" logger="node-ipam-controller" key="multinode-957000-m03"
	
	
	==> kube-proxy [3ab2e86b804e] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 23:05:28.364017       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 23:05:28.379868       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.13"]
	E0831 23:05:28.379947       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 23:05:28.568230       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 23:05:28.568276       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 23:05:28.568294       1 server_linux.go:169] "Using iptables Proxier"
	I0831 23:05:28.570377       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 23:05:28.570907       1 server.go:483] "Version info" version="v1.31.0"
	I0831 23:05:28.570936       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 23:05:28.572169       1 config.go:197] "Starting service config controller"
	I0831 23:05:28.572437       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 23:05:28.572558       1 config.go:104] "Starting endpoint slice config controller"
	I0831 23:05:28.572595       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 23:05:28.574453       1 config.go:326] "Starting node config controller"
	I0831 23:05:28.574459       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 23:05:28.673844       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0831 23:05:28.673918       1 shared_informer.go:320] Caches are synced for service config
	I0831 23:05:28.676078       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-proxy [d6ba988e6369] <==
		add table ip kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	E0831 22:57:37.621510       1 proxier.go:734] "Error cleaning up nftables rules" err=<
		could not run nftables command: /dev/stdin:1:1-25: Error: Could not process rule: Operation not supported
		add table ip6 kube-proxy
		^^^^^^^^^^^^^^^^^^^^^^^^^
	 >
	I0831 22:57:37.628476       1 server.go:677] "Successfully retrieved node IP(s)" IPs=["192.169.0.13"]
	E0831 22:57:37.628526       1 server.go:234] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I0831 22:57:37.676550       1 server_linux.go:146] "No iptables support for family" ipFamily="IPv6"
	I0831 22:57:37.676608       1 server.go:245] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0831 22:57:37.676626       1 server_linux.go:169] "Using iptables Proxier"
	I0831 22:57:37.687655       1 proxier.go:255] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I0831 22:57:37.687963       1 server.go:483] "Version info" version="v1.31.0"
	I0831 22:57:37.687973       1 server.go:485] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 22:57:37.702714       1 config.go:197] "Starting service config controller"
	I0831 22:57:37.703175       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0831 22:57:37.703368       1 config.go:104] "Starting endpoint slice config controller"
	I0831 22:57:37.703410       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0831 22:57:37.703846       1 config.go:326] "Starting node config controller"
	I0831 22:57:37.703925       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0831 22:57:37.804187       1 shared_informer.go:320] Caches are synced for node config
	I0831 22:57:37.804267       1 shared_informer.go:320] Caches are synced for service config
	I0831 22:57:37.804281       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-scheduler [47934ef0bc6f] <==
	W0831 22:57:29.086589       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0831 22:57:29.086634       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	W0831 22:57:29.087077       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0831 22:57:29.087143       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError"
	E0831 22:57:29.085404       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:57:29.087337       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0831 22:57:29.087753       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0831 22:57:29.087787       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User \"system:kube-scheduler\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
	E0831 22:57:29.087530       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:57:29.087433       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0831 22:57:29.087873       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:57:29.087443       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0831 22:57:29.087911       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csinodes\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:57:29.890506       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0831 22:57:29.890698       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
	W0831 22:57:29.896139       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0831 22:57:29.896200       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:57:30.005033       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0831 22:57:30.005081       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:57:30.107372       1 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0831 22:57:30.107461       1 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError"
	W0831 22:57:30.141915       1 reflector.go:561] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0831 22:57:30.141959       1 reflector.go:158] "Unhandled Error" err="runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError"
	I0831 22:57:32.879579       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0831 23:00:36.399970       1 run.go:72] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [a89caa1cde06] <==
	I0831 23:05:25.417449       1 serving.go:386] Generated self-signed cert in-memory
	W0831 23:05:26.399693       1 requestheader_controller.go:196] Unable to get configmap/extension-apiserver-authentication in kube-system.  Usually fixed by 'kubectl create rolebinding -n kube-system ROLEBINDING_NAME --role=extension-apiserver-authentication-reader --serviceaccount=YOUR_NS:YOUR_SA'
	W0831 23:05:26.399836       1 authentication.go:370] Error looking up in-cluster authentication configuration: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot get resource "configmaps" in API group "" in the namespace "kube-system"
	W0831 23:05:26.399956       1 authentication.go:371] Continuing without authentication configuration. This may treat all requests as anonymous.
	W0831 23:05:26.400041       1 authentication.go:372] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I0831 23:05:26.420187       1 server.go:167] "Starting Kubernetes Scheduler" version="v1.31.0"
	I0831 23:05:26.420359       1 server.go:169] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0831 23:05:26.422091       1 secure_serving.go:213] Serving securely on 127.0.0.1:10259
	I0831 23:05:26.422196       1 configmap_cafile_content.go:205] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0831 23:05:26.422941       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0831 23:05:26.422290       1 tlsconfig.go:243] "Starting DynamicServingCertificateController"
	I0831 23:05:26.524198       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Aug 31 23:05:38 multinode-957000 kubelet[1350]: E0831 23:05:38.501015    1350 kubelet.go:2901] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized"
	Aug 31 23:05:38 multinode-957000 kubelet[1350]: E0831 23:05:38.509586    1350 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized" pod="kube-system/coredns-6f6b679f8f-q4s6r" podUID="b794efa0-8367-452b-90be-870e8d349f6f"
	Aug 31 23:05:39 multinode-957000 kubelet[1350]: E0831 23:05:39.509920    1350 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized" pod="default/busybox-7dff88458-9qs4p" podUID="f156180c-4f6f-40b1-9535-1be92f5e3208"
	Aug 31 23:05:40 multinode-957000 kubelet[1350]: E0831 23:05:40.510893    1350 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized" pod="kube-system/coredns-6f6b679f8f-q4s6r" podUID="b794efa0-8367-452b-90be-870e8d349f6f"
	Aug 31 23:05:41 multinode-957000 kubelet[1350]: E0831 23:05:41.509851    1350 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized" pod="default/busybox-7dff88458-9qs4p" podUID="f156180c-4f6f-40b1-9535-1be92f5e3208"
	Aug 31 23:05:42 multinode-957000 kubelet[1350]: E0831 23:05:42.510259    1350 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized" pod="kube-system/coredns-6f6b679f8f-q4s6r" podUID="b794efa0-8367-452b-90be-870e8d349f6f"
	Aug 31 23:05:43 multinode-957000 kubelet[1350]: E0831 23:05:43.191758    1350 configmap.go:193] Couldn't get configMap kube-system/coredns: object "kube-system"/"coredns" not registered
	Aug 31 23:05:43 multinode-957000 kubelet[1350]: E0831 23:05:43.191881    1350 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b794efa0-8367-452b-90be-870e8d349f6f-config-volume podName:b794efa0-8367-452b-90be-870e8d349f6f nodeName:}" failed. No retries permitted until 2024-08-31 23:05:59.191862731 +0000 UTC m=+35.833057592 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/b794efa0-8367-452b-90be-870e8d349f6f-config-volume") pod "coredns-6f6b679f8f-q4s6r" (UID: "b794efa0-8367-452b-90be-870e8d349f6f") : object "kube-system"/"coredns" not registered
	Aug 31 23:05:43 multinode-957000 kubelet[1350]: E0831 23:05:43.292597    1350 projected.go:288] Couldn't get configMap default/kube-root-ca.crt: object "default"/"kube-root-ca.crt" not registered
	Aug 31 23:05:43 multinode-957000 kubelet[1350]: E0831 23:05:43.292940    1350 projected.go:194] Error preparing data for projected volume kube-api-access-jbc6k for pod default/busybox-7dff88458-9qs4p: object "default"/"kube-root-ca.crt" not registered
	Aug 31 23:05:43 multinode-957000 kubelet[1350]: E0831 23:05:43.293214    1350 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f156180c-4f6f-40b1-9535-1be92f5e3208-kube-api-access-jbc6k podName:f156180c-4f6f-40b1-9535-1be92f5e3208 nodeName:}" failed. No retries permitted until 2024-08-31 23:05:59.29319159 +0000 UTC m=+35.934386449 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jbc6k" (UniqueName: "kubernetes.io/projected/f156180c-4f6f-40b1-9535-1be92f5e3208-kube-api-access-jbc6k") pod "busybox-7dff88458-9qs4p" (UID: "f156180c-4f6f-40b1-9535-1be92f5e3208") : object "default"/"kube-root-ca.crt" not registered
	Aug 31 23:05:58 multinode-957000 kubelet[1350]: I0831 23:05:58.942803    1350 scope.go:117] "RemoveContainer" containerID="93e675b8bc50878e755fa9ddfebc5e10dea746b7873ef297eb63521ec64eee7c"
	Aug 31 23:05:58 multinode-957000 kubelet[1350]: I0831 23:05:58.943049    1350 scope.go:117] "RemoveContainer" containerID="10dcf0ab9505d4a779752a1521968e0396d569b4d15a732ec4b5ab45ace0c6d6"
	Aug 31 23:05:58 multinode-957000 kubelet[1350]: E0831 23:05:58.943131    1350 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(f389bc9a-20cc-4e07-bc7f-f418f53773c9)\"" pod="kube-system/storage-provisioner" podUID="f389bc9a-20cc-4e07-bc7f-f418f53773c9"
	Aug 31 23:06:13 multinode-957000 kubelet[1350]: I0831 23:06:13.510820    1350 scope.go:117] "RemoveContainer" containerID="10dcf0ab9505d4a779752a1521968e0396d569b4d15a732ec4b5ab45ace0c6d6"
	Aug 31 23:06:23 multinode-957000 kubelet[1350]: E0831 23:06:23.539082    1350 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 23:06:23 multinode-957000 kubelet[1350]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 23:06:23 multinode-957000 kubelet[1350]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 23:06:23 multinode-957000 kubelet[1350]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 23:06:23 multinode-957000 kubelet[1350]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Aug 31 23:07:23 multinode-957000 kubelet[1350]: E0831 23:07:23.534233    1350 iptables.go:577] "Could not set up iptables canary" err=<
	Aug 31 23:07:23 multinode-957000 kubelet[1350]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Aug 31 23:07:23 multinode-957000 kubelet[1350]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Aug 31 23:07:23 multinode-957000 kubelet[1350]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Aug 31 23:07:23 multinode-957000 kubelet[1350]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

-- /stdout --
helpers_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p multinode-957000 -n multinode-957000
helpers_test.go:262: (dbg) Run:  kubectl --context multinode-957000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:286: <<< TestMultiNode/serial/RestartMultiNode FAILED: end of post-mortem logs <<<
helpers_test.go:287: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiNode/serial/RestartMultiNode (189.56s)

TestScheduledStopUnix (86.1s)

=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 start -p scheduled-stop-209000 --memory=2048 --driver=hyperkit 
scheduled_stop_test.go:128: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p scheduled-stop-209000 --memory=2048 --driver=hyperkit : exit status 80 (1m20.759745328s)

-- stdout --
	* [scheduled-stop-209000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "scheduled-stop-209000" primary control-plane node in "scheduled-stop-209000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "scheduled-stop-209000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: unexpected EOF
	* Failed to start hyperkit VM. Running "minikube delete -p scheduled-stop-209000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for f2:d0:3b:61:6e:e0
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for f2:d0:3b:61:6e:e0
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
scheduled_stop_test.go:130: starting minikube: exit status 80

-- stdout --
	* [scheduled-stop-209000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "scheduled-stop-209000" primary control-plane node in "scheduled-stop-209000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "scheduled-stop-209000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: unexpected EOF
	* Failed to start hyperkit VM. Running "minikube delete -p scheduled-stop-209000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for f2:d0:3b:61:6e:e0
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for f2:d0:3b:61:6e:e0
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
panic.go:626: *** TestScheduledStopUnix FAILED at 2024-08-31 16:12:29.670759 -0700 PDT m=+4046.870037099
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-209000 -n scheduled-stop-209000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-209000 -n scheduled-stop-209000: exit status 7 (77.770584ms)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0831 16:12:29.746904    5769 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0831 16:12:29.746925    5769 status.go:249] status error: getting IP: IP address is not set

** /stderr **
helpers_test.go:240: status error: exit status 7 (may be ok)
helpers_test.go:242: "scheduled-stop-209000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:176: Cleaning up "scheduled-stop-209000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p scheduled-stop-209000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p scheduled-stop-209000: (5.257060888s)
--- FAIL: TestScheduledStopUnix (86.10s)

TestPause/serial/Start (141.96s)

=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-512000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit 
pause_test.go:80: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p pause-512000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit : exit status 80 (2m21.861364509s)

-- stdout --
	* [pause-512000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "pause-512000" primary control-plane node in "pause-512000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "pause-512000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 3a:b5:ca:30:43:8f
	* Failed to start hyperkit VM. Running "minikube delete -p pause-512000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for c2:34:a5:f6:91:33
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for c2:34:a5:f6:91:33
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
pause_test.go:82: failed to start minikube with args: "out/minikube-darwin-amd64 start -p pause-512000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit " : exit status 80
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p pause-512000 -n pause-512000
helpers_test.go:240: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p pause-512000 -n pause-512000: exit status 7 (94.197886ms)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0831 16:54:46.220231    8301 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0831 16:54:46.220250    8301 status.go:249] status error: getting IP: IP address is not set

** /stderr **
helpers_test.go:240: status error: exit status 7 (may be ok)
helpers_test.go:242: "pause-512000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestPause/serial/Start (141.96s)

TestStartStop/group/old-k8s-version/serial/SecondStart (7201.72s)

=== RUN   TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-553000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.20.0
E0831 17:04:57.433477    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/custom-flannel-942000/client.crt: no such file or directory" logger="UnhandledError"
panic: test timed out after 2h0m0s
running tests:
	TestNetworkPlugins (50m38s)
	TestNetworkPlugins/group (2m6s)
	TestStartStop (37m56s)
	TestStartStop/group/no-preload (2m6s)
	TestStartStop/group/no-preload/serial (2m6s)
	TestStartStop/group/no-preload/serial/SecondStart (54s)
	TestStartStop/group/old-k8s-version (3m9s)
	TestStartStop/group/old-k8s-version/serial (3m9s)
	TestStartStop/group/old-k8s-version/serial/SecondStart (10s)

goroutine 4217 [running]:
testing.(*M).startAlarm.func1()
	/usr/local/go/src/testing/testing.go:2366 +0x385
created by time.goFunc
	/usr/local/go/src/time/sleep.go:177 +0x2d

goroutine 1 [chan receive, 15 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1650 +0x4ab
testing.tRunner(0xc000469520, 0xc0009efbb0)
	/usr/local/go/src/testing/testing.go:1695 +0x134
testing.runTests(0xc0000be4b0, {0xe975cc0, 0x2b, 0x2b}, {0x9fa56c5?, 0xbcd081c?, 0xe9987c0?})
	/usr/local/go/src/testing/testing.go:2159 +0x445
testing.(*M).Run(0xc000964a00)
	/usr/local/go/src/testing/testing.go:2027 +0x68b
k8s.io/minikube/test/integration.TestMain(0xc000964a00)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/main_test.go:62 +0x8b
main.main()
	_testmain.go:133 +0x195

goroutine 9 [select]:
go.opencensus.io/stats/view.(*worker).start(0xc000646e00)
	/var/lib/jenkins/go/pkg/mod/go.opencensus.io@v0.24.0/stats/view/worker.go:292 +0x9f
created by go.opencensus.io/stats/view.init.0 in goroutine 1
	/var/lib/jenkins/go/pkg/mod/go.opencensus.io@v0.24.0/stats/view/worker.go:34 +0x8d

goroutine 4129 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4112
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 149 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 148
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 3645 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc001e1f750, 0xc001e1f798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0x40?, 0xc001e1f750, 0xc001e1f798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0x293030303133302d?, 0x64207c2047424420?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001e1f7d0?, 0xa05f844?, 0xc000058e40?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3661
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 147 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc000874f90, 0x2d)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc0013e4d80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc000874fc0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000888010, {0xd3e3ce0, 0xc0007cd110}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000888010, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 141
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 28 [select]:
k8s.io/klog/v2.(*flushDaemon).run.func1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/klog/v2@v2.130.1/klog.go:1141 +0x117
created by k8s.io/klog/v2.(*flushDaemon).run in goroutine 27
	/var/lib/jenkins/go/pkg/mod/k8s.io/klog/v2@v2.130.1/klog.go:1137 +0x171

goroutine 148 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc000b7b750, 0xc0008aff98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0x70?, 0xc000b7b750, 0xc000b7b798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0x53203a6874615074?, 0x6e4d5674656b636f?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x50746e6567414853?, 0x55504720303a4449?, 0x506f747541203a73?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 141
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 3902 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc001e19f50, 0xc001e19f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0x10?, 0xc001e19f50, 0xc001e19f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xc00184c340?, 0xa019540?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001e19fd0?, 0xa05f844?, 0xc001d32120?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3905
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 2912 [chan receive, 38 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1650 +0x4ab
testing.tRunner(0xc000c3a000, 0xd3d5b68)
	/usr/local/go/src/testing/testing.go:1695 +0x134
created by testing.(*T).Run in goroutine 2579
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 1642 [chan send, 97 minutes]:
os/exec.(*Cmd).watchCtx(0xc001b51980, 0xc001b3b3e0)
	/usr/local/go/src/os/exec/exec.go:793 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 1279
	/usr/local/go/src/os/exec/exec.go:754 +0x976

goroutine 2542 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc001f162d0, 0x1c)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc0008b2d80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001f16300)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0016f6100, {0xd3e3ce0, 0xc001550180}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0016f6100, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2555
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 1366 [chan receive, 97 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc00171f780, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 1304
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 3901 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0xc000908e90, 0x0)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc001e1bd80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc000908ec0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000881240, {0xd3e3ce0, 0xc001b68450}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000881240, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3905
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 141 [chan receive, 117 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc000874fc0, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 139
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 140 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 139
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 3188 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3187
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 4177 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc000b7a750, 0xc000b7a798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0xe0?, 0xc000b7a750, 0xc000b7a798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xc00184c4e0?, 0xa019540?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xa05f7e5?, 0xc000c3e780?, 0xc001906de0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 4174
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 4169 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0x56420a50, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc001fd2c00?, 0xc0013ecc90?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc001fd2c00, {0xc0013ecc90, 0x370, 0x370})
	/usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0xc0014e4568, {0xc0013ecc90?, 0xa05d9da?, 0x228?})
	/usr/local/go/src/os/file.go:118 +0x52
bytes.(*Buffer).ReadFrom(0xc001f12cf0, {0xd3e26a8, 0xc001512c78})
	/usr/local/go/src/bytes/buffer.go:211 +0x98
io.copyBuffer({0xd3e27e8, 0xc001f12cf0}, {0xd3e26a8, 0xc001512c78}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x151
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0xe8a7b80?, {0xd3e27e8, 0xc001f12cf0})
	/usr/local/go/src/os/file.go:269 +0x58
os.(*File).WriteTo(0xe935d10?, {0xd3e27e8?, 0xc001f12cf0?})
	/usr/local/go/src/os/file.go:247 +0x49
io.copyBuffer({0xd3e27e8, 0xc001f12cf0}, {0xd3e2768, 0xc0014e4568}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x9d
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:578 +0x34
os/exec.(*Cmd).Start.func2(0xc001916f80?)
	/usr/local/go/src/os/exec/exec.go:728 +0x2c
created by os/exec.(*Cmd).Start in goroutine 4168
	/usr/local/go/src/os/exec/exec.go:727 +0x9ae

goroutine 2913 [chan receive, 3 minutes]:
testing.(*T).Run(0xc000c3a1a0, {0xbc775f6?, 0x0?}, 0xc00162d500)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc000c3a1a0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:130 +0xad9
testing.tRunner(0xc000c3a1a0, 0xc001f161c0)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2912
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 1334 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc00171f750, 0x28)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc0013dfd80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc00171f780)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001828a80, {0xd3e3ce0, 0xc0013acf60}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001828a80, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 1366
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 2593 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0xc0008158b0)
	/usr/local/go/src/testing/testing.go:1817 +0xac
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1665 +0x5e9
testing.tRunner(0xc0013feea0, 0xc001826888)
	/usr/local/go/src/testing/testing.go:1695 +0x134
created by testing.(*T).Run in goroutine 2495
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3531 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3530
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 1575 [chan send, 97 minutes]:
os/exec.(*Cmd).watchCtx(0xc001d06300, 0xc0019076e0)
	/usr/local/go/src/os/exec/exec.go:793 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 1574
	/usr/local/go/src/os/exec/exec.go:754 +0x976

goroutine 3529 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc000c06bd0, 0xe)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc001467d80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc000c06c00)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0016f66b0, {0xd3e3ce0, 0xc0016b38c0}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0016f66b0, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3545
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 4111 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc0018411d0, 0x0)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc000505580?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001841200)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0008879a0, {0xd3e3ce0, 0xc00179a8d0}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0008879a0, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 4121
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 3646 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3645
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 3321 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0xc001768450, 0x10)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc00146bd80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001768480)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000a78a00, {0xd3e3ce0, 0xc001642bd0}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000a78a00, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3309
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 2543 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc000b7b750, 0xc0013def98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0xd0?, 0xc000b7b750, 0xc000b7b798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xc0015d9440?, 0xc0006cfe00?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000b7b7d0?, 0xa05f844?, 0xc000003380?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2555
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 2555 [chan receive, 52 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001f16300, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2510
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 1541 [chan send, 97 minutes]:
os/exec.(*Cmd).watchCtx(0xc001ac7980, 0xc0019e5f80)
	/usr/local/go/src/os/exec/exec.go:793 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 1540
	/usr/local/go/src/os/exec/exec.go:754 +0x976

goroutine 1170 [IO wait, 101 minutes]:
internal/poll.runtime_pollWait(0x56420c40, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc001424100?, 0x3fe?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0xc001424100)
	/usr/local/go/src/internal/poll/fd_unix.go:611 +0x2ac
net.(*netFD).accept(0xc001424100)
	/usr/local/go/src/net/fd_unix.go:172 +0x29
net.(*TCPListener).accept(0xc0016481c0)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x1e
net.(*TCPListener).Accept(0xc0016481c0)
	/usr/local/go/src/net/tcpsock.go:327 +0x30
net/http.(*Server).Serve(0xc0002305a0, {0xd3fcae0, 0xc0016481c0})
	/usr/local/go/src/net/http/server.go:3260 +0x33e
net/http.(*Server).ListenAndServe(0xc0002305a0)
	/usr/local/go/src/net/http/server.go:3189 +0x71
k8s.io/minikube/test/integration.startHTTPProxy.func1(0xc0017344e0?, 0xc0017344e0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/functional_test.go:2213 +0x18
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 1151
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/functional_test.go:2212 +0x129

goroutine 3308 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3304
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 4003 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0xc001f17490, 0x0)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc000b7b580?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001f174c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000c7c9c0, {0xd3e3ce0, 0xc0014c0990}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000c7c9c0, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 4027
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 3443 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc001768350, 0xf)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc0008acd80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001768380)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0017c2330, {0xd3e3ce0, 0xc001642300}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0017c2330, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3430
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 4121 [chan receive, 2 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001841200, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4116
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 2495 [chan receive, 52 minutes]:
testing.(*T).Run(0xc0013fe1a0, {0xbc75fa6?, 0x3baad18d7dd?}, 0xc001826888)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestNetworkPlugins(0xc0013fe1a0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/net_test.go:52 +0xd4
testing.tRunner(0xc0013fe1a0, 0xd3d59c0)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 1336 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 1335
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 4174 [chan receive, 2 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001769cc0, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4156
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 4170 [IO wait]:
internal/poll.runtime_pollWait(0x56420860, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc001fd2cc0?, 0xc001985e78?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc001fd2cc0, {0xc001985e78, 0x1e188, 0x1e188})
	/usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0xc0014e4598, {0xc001985e78?, 0xc0018d6000?, 0x1fe76?})
	/usr/local/go/src/os/file.go:118 +0x52
bytes.(*Buffer).ReadFrom(0xc001f12d20, {0xd3e26a8, 0xc001512c80})
	/usr/local/go/src/bytes/buffer.go:211 +0x98
io.copyBuffer({0xd3e27e8, 0xc001f12d20}, {0xd3e26a8, 0xc001512c80}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x151
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x0?, {0xd3e27e8, 0xc001f12d20})
	/usr/local/go/src/os/file.go:269 +0x58
os.(*File).WriteTo(0xe935d10?, {0xd3e27e8?, 0xc001f12d20?})
	/usr/local/go/src/os/file.go:247 +0x49
io.copyBuffer({0xd3e27e8, 0xc001f12d20}, {0xd3e2768, 0xc0014e4598}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x9d
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:578 +0x34
os/exec.(*Cmd).Start.func2(0xc0018d9680?)
	/usr/local/go/src/os/exec/exec.go:728 +0x2c
created by os/exec.(*Cmd).Start in goroutine 4168
	/usr/local/go/src/os/exec/exec.go:727 +0x9ae

goroutine 3789 [chan receive, 5 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001769240, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3784
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 3323 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3322
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 3089 [chan receive, 9 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc00171e600, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3067
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 4093 [chan receive, 2 minutes]:
testing.(*T).Run(0xc00184c000, {0xbc83236?, 0x60400000004?}, 0xc001916f80)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0xc00184c000)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:155 +0x2af
testing.tRunner(0xc00184c000, 0xc001b76280)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2916
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3430 [chan receive, 7 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001768380, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3423
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 2918 [chan receive, 38 minutes]:
testing.(*testContext).waitParallel(0xc0008158b0)
	/usr/local/go/src/testing/testing.go:1817 +0xac
testing.(*T).Parallel(0xc000c3ab60)
	/usr/local/go/src/testing/testing.go:1484 +0x229
k8s.io/minikube/test/integration.MaybeParallel(0xc000c3ab60)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:484 +0x34
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc000c3ab60)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:94 +0x45
testing.tRunner(0xc000c3ab60, 0xc001f16400)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2912
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 1365 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 1304
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 1766 [select, 97 minutes]:
net/http.(*persistConn).writeLoop(0xc001eab200)
	/usr/local/go/src/net/http/transport.go:2458 +0xf0
created by net/http.(*Transport).dialConn in goroutine 1755
	/usr/local/go/src/net/http/transport.go:1800 +0x1585

goroutine 2915 [chan receive, 38 minutes]:
testing.(*testContext).waitParallel(0xc0008158b0)
	/usr/local/go/src/testing/testing.go:1817 +0xac
testing.(*T).Parallel(0xc000c3a680)
	/usr/local/go/src/testing/testing.go:1484 +0x229
k8s.io/minikube/test/integration.MaybeParallel(0xc000c3a680)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:484 +0x34
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc000c3a680)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:94 +0x45
testing.tRunner(0xc000c3a680, 0xc001f16240)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2912
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 4225 [IO wait]:
internal/poll.runtime_pollWait(0x56420290, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc00174e4e0?, 0xc000be45b2?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc00174e4e0, {0xc000be45b2, 0x1a4e, 0x1a4e})
	/usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0xc001512290, {0xc000be45b2?, 0xc000704a80?, 0x3ea0?})
	/usr/local/go/src/os/file.go:118 +0x52
bytes.(*Buffer).ReadFrom(0xc0014e24b0, {0xd3e26a8, 0xc0014e40a0})
	/usr/local/go/src/bytes/buffer.go:211 +0x98
io.copyBuffer({0xd3e27e8, 0xc0014e24b0}, {0xd3e26a8, 0xc0014e40a0}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x151
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0xc001527e78?, {0xd3e27e8, 0xc0014e24b0})
	/usr/local/go/src/os/file.go:269 +0x58
os.(*File).WriteTo(0xe935d10?, {0xd3e27e8?, 0xc0014e24b0?})
	/usr/local/go/src/os/file.go:247 +0x49
io.copyBuffer({0xd3e27e8, 0xc0014e24b0}, {0xd3e2768, 0xc001512290}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x9d
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:578 +0x34
os/exec.(*Cmd).Start.func2(0xc001e54060?)
	/usr/local/go/src/os/exec/exec.go:728 +0x2c
created by os/exec.(*Cmd).Start in goroutine 4207
	/usr/local/go/src/os/exec/exec.go:727 +0x9ae

goroutine 2914 [chan receive, 38 minutes]:
testing.(*testContext).waitParallel(0xc0008158b0)
	/usr/local/go/src/testing/testing.go:1817 +0xac
testing.(*T).Parallel(0xc000c3a340)
	/usr/local/go/src/testing/testing.go:1484 +0x229
k8s.io/minikube/test/integration.MaybeParallel(0xc000c3a340)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:484 +0x34
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc000c3a340)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:94 +0x45
testing.tRunner(0xc000c3a340, 0xc001f16200)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2912
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 2916 [chan receive, 2 minutes]:
testing.(*T).Run(0xc000c3a820, {0xbc775f6?, 0x0?}, 0xc001b76280)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc000c3a820)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:130 +0xad9
testing.tRunner(0xc000c3a820, 0xc001f16340)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2912
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 1765 [select, 97 minutes]:
net/http.(*persistConn).readLoop(0xc001eab200)
	/usr/local/go/src/net/http/transport.go:2261 +0xd3a
created by net/http.(*Transport).dialConn in goroutine 1755
	/usr/local/go/src/net/http/transport.go:1799 +0x152f

goroutine 2554 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 2510
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 1335 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc000b7c750, 0xc0014cdf98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0x18?, 0xc000b7c750, 0xc000b7c798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xc000c3a340?, 0xa019540?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000b7c7d0?, 0xbc1b002?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 1366
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 3429 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3423
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 3071 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc00171e5d0, 0x10)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc0014c8d80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc00171e600)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0016f67f0, {0xd3e3ce0, 0xc0014e2a20}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0016f67f0, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3089
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 1371 [chan send, 97 minutes]:
os/exec.(*Cmd).watchCtx(0xc000c3fb00, 0xc000059680)
	/usr/local/go/src/os/exec/exec.go:793 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 1370
	/usr/local/go/src/os/exec/exec.go:754 +0x976

goroutine 4112 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc001e1ff50, 0xc001e1ff98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0x10?, 0xc001e1ff50, 0xc001e1ff98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xc000c3bd40?, 0xa019540?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001e1ffd0?, 0xa05f844?, 0xc0013e6900?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 4121
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 3996 [chan receive]:
testing.(*T).Run(0xc000c3b860, {0xbc83236?, 0x60400000004?}, 0xc00162c100)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0xc000c3b860)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:155 +0x2af
testing.tRunner(0xc000c3b860, 0xc00162d500)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2913
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 4026 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4022
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 2544 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2543
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 4207 [syscall]:
syscall.syscall6(0xc0014e3f80?, 0x1000000000010?, 0x10000000019?, 0x563454e8?, 0x90?, 0xf432108?, 0x90?)
	/usr/local/go/src/runtime/sys_darwin.go:45 +0x98
syscall.wait4(0xc0008adb48?, 0x9ee60c5?, 0x90?, 0xd33f280?)
	/usr/local/go/src/syscall/zsyscall_darwin_amd64.go:44 +0x45
syscall.Wait4(0xa016885?, 0xc0008adb7c, 0x0?, 0x0?)
	/usr/local/go/src/syscall/syscall_bsd.go:144 +0x25
os.(*Process).wait(0xc001ddc240)
	/usr/local/go/src/os/exec_unix.go:43 +0x6d
os.(*Process).Wait(...)
	/usr/local/go/src/os/exec.go:134
os/exec.(*Cmd).Wait(0xc001714180)
	/usr/local/go/src/os/exec/exec.go:901 +0x45
os/exec.(*Cmd).Run(0xc001714180)
	/usr/local/go/src/os/exec/exec.go:608 +0x2d
k8s.io/minikube/test/integration.Run(0xc000c3b6c0, 0xc001714180)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:104 +0x1e5
k8s.io/minikube/test/integration.validateSecondStart({0xd409e50, 0xc0004617a0}, 0xc000c3b6c0, {0xc000a7e4b0, 0x16}, {0x1a82e168018f9758?, 0xc0018f9760?}, {0xa018c13?, 0x9f70c6f?}, {0xc0016fe180, ...})
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:256 +0xe5
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0xc000c3b6c0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:156 +0x66
testing.tRunner(0xc000c3b6c0, 0xc00162c100)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 3996
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 4208 [IO wait]:
internal/poll.runtime_pollWait(0x56420670, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc00174e360?, 0xc000a7a2a6?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc00174e360, {0xc000a7a2a6, 0x55a, 0x55a})
	/usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0xc0015121c0, {0xc000a7a2a6?, 0xa05d9da?, 0x263?})
	/usr/local/go/src/os/file.go:118 +0x52
bytes.(*Buffer).ReadFrom(0xc0014e2480, {0xd3e26a8, 0xc0014e4090})
	/usr/local/go/src/bytes/buffer.go:211 +0x98
io.copyBuffer({0xd3e27e8, 0xc0014e2480}, {0xd3e26a8, 0xc0014e4090}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x151
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0xe8a7b80?, {0xd3e27e8, 0xc0014e2480})
	/usr/local/go/src/os/file.go:269 +0x58
os.(*File).WriteTo(0xe935d10?, {0xd3e27e8?, 0xc0014e2480?})
	/usr/local/go/src/os/file.go:247 +0x49
io.copyBuffer({0xd3e27e8, 0xc0014e2480}, {0xd3e2768, 0xc0015121c0}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x9d
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:578 +0x34
os/exec.(*Cmd).Start.func2(0xc00162c100?)
	/usr/local/go/src/os/exec/exec.go:728 +0x2c
created by os/exec.(*Cmd).Start in goroutine 4207
	/usr/local/go/src/os/exec/exec.go:727 +0x9ae

goroutine 3309 [chan receive, 8 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001768480, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3304
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 3444 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc001526750, 0xc001526798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0xa0?, 0xc001526750, 0xc001526798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xc001e0c820?, 0xa019540?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc0015267d0?, 0xa05f844?, 0xc0015d94a0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3430
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 3201 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3184
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 3105 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3072
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 3183 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc001f170d0, 0x10)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc00146cd80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001f17100)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc00084e5a0, {0xd3e3ce0, 0xc0004fb6b0}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc00084e5a0, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3189
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 2917 [chan receive, 38 minutes]:
testing.(*testContext).waitParallel(0xc0008158b0)
	/usr/local/go/src/testing/testing.go:1817 +0xac
testing.(*T).Parallel(0xc000c3a9c0)
	/usr/local/go/src/testing/testing.go:1484 +0x229
k8s.io/minikube/test/integration.MaybeParallel(0xc000c3a9c0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:484 +0x34
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc000c3a9c0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:94 +0x45
testing.tRunner(0xc000c3a9c0, 0xc001f16380)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2912
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3530 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc001525750, 0xc001525798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0xc0?, 0xc001525750, 0xc001525798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xc0006a0230?, 0xc0006a0230?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc0015257d0?, 0xa05f844?, 0xc001ea9590?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3545
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 2579 [chan receive, 38 minutes]:
testing.(*T).Run(0xc001734d00, {0xbc75fa6?, 0xa018c13?}, 0xd3d5b68)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop(0xc001734d00)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:46 +0x35
testing.tRunner(0xc001734d00, 0xd3d5a08)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3544 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3543
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 3545 [chan receive, 7 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc000c06c00, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3543
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 3322 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc000504750, 0xc000504798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0xa0?, 0xc000504750, 0xc000504798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xc000c3ad00?, 0xa019540?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc0005047d0?, 0xa05f844?, 0xc0015d85a0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3309
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 3072 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc000b7df50, 0xc000b7df98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0x0?, 0xc000b7df50, 0xc000b7df98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xa51fb25?, 0xc0018dac60?, 0xd4006c0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3089
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 3088 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3067
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 3189 [chan receive, 9 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001f17100, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3187
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 4168 [syscall, 2 minutes]:
syscall.syscall6(0xc001f13f80?, 0x1000000000010?, 0x10000000019?, 0x562f0478?, 0x90?, 0xf432108?, 0x90?)
	/usr/local/go/src/runtime/sys_darwin.go:45 +0x98
syscall.wait4(0xc001540b48?, 0x9ee60c5?, 0x90?, 0xd33f280?)
	/usr/local/go/src/syscall/zsyscall_darwin_amd64.go:44 +0x45
syscall.Wait4(0xa016885?, 0xc001540b7c, 0x0?, 0x0?)
	/usr/local/go/src/syscall/syscall_bsd.go:144 +0x25
os.(*Process).wait(0xc001ad4900)
	/usr/local/go/src/os/exec_unix.go:43 +0x6d
os.(*Process).Wait(...)
	/usr/local/go/src/os/exec.go:134
os/exec.(*Cmd).Wait(0xc001833380)
	/usr/local/go/src/os/exec/exec.go:901 +0x45
os/exec.(*Cmd).Run(0xc001833380)
	/usr/local/go/src/os/exec/exec.go:608 +0x2d
k8s.io/minikube/test/integration.Run(0xc000c3b380, 0xc001833380)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:104 +0x1e5
k8s.io/minikube/test/integration.validateSecondStart({0xd409e50, 0xc00046e000}, 0xc000c3b380, {0xc00180e1e0, 0x11}, {0x305bd64800507f58?, 0xc000507f60?}, {0xa018c13?, 0x9f70c6f?}, {0xc001b04600, ...})
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:256 +0xe5
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0xc000c3b380)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:156 +0x66
testing.tRunner(0xc000c3b380, 0xc001916f80)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 4093
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3445 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3444
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 3644 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc001f16650, 0xd)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc0014cfd80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001f16680)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000990660, {0xd3e3ce0, 0xc0008e5e90}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000990660, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3661
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 4120 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4116
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 3184 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc001525f50, 0xc001525f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0x40?, 0xc001525f50, 0xc001525f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xc001525fb0?, 0xa2886f8?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001525fd0?, 0xa05f844?, 0xc001906240?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3189
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 4178 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4177
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 4160 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc001769c90, 0x0)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc001522580?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001769cc0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0017c2660, {0xd3e3ce0, 0xc001643230}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0017c2660, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 4174
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 3661 [chan receive, 5 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001f16680, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3659
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 3799 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3798
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 3660 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3659
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 3798 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc001e1df50, 0xc001e1df98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0xc0?, 0xc001e1df50, 0xc001e1df98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xa496876?, 0xc001ad6300?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xa05f7e5?, 0xc0016fef00?, 0xc0019e4fc0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3789
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 4005 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4004
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 3905 [chan receive, 3 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc000908ec0, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3884
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 3903 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3902
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:280 +0xbb

goroutine 3797 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0xc001769210, 0x0)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc0018fcd80?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0xd424120)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/queue.go:282 +0x98
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001769240)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:159 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:154
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001828ee0, {0xd3e3ce0, 0xc0016b3aa0}, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001828ee0, 0x3b9aca00, 0x0, 0x1, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3789
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:143 +0x1ef

goroutine 3888 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3884
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 3788 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3784
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 4173 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0xd4006c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:304 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4156
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/util/workqueue/delaying_queue.go:141 +0x238

goroutine 4004 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xd40a010, 0xc00014a060}, 0xc000506f50, 0xc000506f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xd40a010, 0xc00014a060}, 0xc0?, 0xc000506f50, 0xc000506f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xd40a010?, 0xc00014a060?}, 0xc000c3b380?, 0xa019540?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xa05f7e5?, 0xc0016fe480?, 0xc0000595c0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.31.0/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 4027
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:145 +0x29a

goroutine 4027 [chan receive, 3 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001f174c0, 0xc00014a060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cert_rotation.go:150 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4022
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.31.0/transport/cache.go:122 +0x585

goroutine 4171 [select, 2 minutes]:
os/exec.(*Cmd).watchCtx(0xc001833380, 0xc0019e5680)
	/usr/local/go/src/os/exec/exec.go:768 +0xb5
created by os/exec.(*Cmd).Start in goroutine 4168
	/usr/local/go/src/os/exec/exec.go:754 +0x976

goroutine 4226 [select]:
os/exec.(*Cmd).watchCtx(0xc001714180, 0xc001e54120)
	/usr/local/go/src/os/exec/exec.go:768 +0xb5
created by os/exec.(*Cmd).Start in goroutine 4207
	/usr/local/go/src/os/exec/exec.go:754 +0x976


Test pass (177/220)

Order  Passed test  Duration (s)
3 TestDownloadOnly/v1.20.0/json-events 14.32
4 TestDownloadOnly/v1.20.0/preload-exists 0
7 TestDownloadOnly/v1.20.0/kubectl 0
8 TestDownloadOnly/v1.20.0/LogsDuration 0.3
9 TestDownloadOnly/v1.20.0/DeleteAll 0.23
10 TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds 0.21
12 TestDownloadOnly/v1.31.0/json-events 7.46
13 TestDownloadOnly/v1.31.0/preload-exists 0
16 TestDownloadOnly/v1.31.0/kubectl 0
17 TestDownloadOnly/v1.31.0/LogsDuration 0.29
18 TestDownloadOnly/v1.31.0/DeleteAll 0.23
19 TestDownloadOnly/v1.31.0/DeleteAlwaysSucceeds 0.21
21 TestBinaryMirror 0.95
25 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.19
26 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.21
27 TestAddons/Setup 227.78
29 TestAddons/serial/Volcano 40.65
31 TestAddons/serial/GCPAuth/Namespaces 0.1
34 TestAddons/parallel/Ingress 20.35
35 TestAddons/parallel/InspektorGadget 10.52
36 TestAddons/parallel/MetricsServer 5.49
37 TestAddons/parallel/HelmTiller 9.52
39 TestAddons/parallel/CSI 41.86
40 TestAddons/parallel/Headlamp 18.41
41 TestAddons/parallel/CloudSpanner 5.35
42 TestAddons/parallel/LocalPath 52.48
43 TestAddons/parallel/NvidiaDevicePlugin 5.35
44 TestAddons/parallel/Yakd 10.44
45 TestAddons/StoppedEnableDisable 5.92
53 TestHyperKitDriverInstallOrUpdate 8.13
56 TestErrorSpam/setup 39.5
57 TestErrorSpam/start 1.77
58 TestErrorSpam/status 0.51
59 TestErrorSpam/pause 1.38
60 TestErrorSpam/unpause 1.34
61 TestErrorSpam/stop 153.82
64 TestFunctional/serial/CopySyncFile 0
65 TestFunctional/serial/StartWithProxy 160.95
66 TestFunctional/serial/AuditLog 0
67 TestFunctional/serial/SoftStart 40.19
68 TestFunctional/serial/KubeContext 0.04
69 TestFunctional/serial/KubectlGetPods 0.07
72 TestFunctional/serial/CacheCmd/cache/add_remote 3.03
73 TestFunctional/serial/CacheCmd/cache/add_local 1.33
74 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.08
75 TestFunctional/serial/CacheCmd/cache/list 0.08
76 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.17
77 TestFunctional/serial/CacheCmd/cache/cache_reload 1.02
78 TestFunctional/serial/CacheCmd/cache/delete 0.16
79 TestFunctional/serial/MinikubeKubectlCmd 1.2
80 TestFunctional/serial/MinikubeKubectlCmdDirectly 1.56
81 TestFunctional/serial/ExtraConfig 39.26
82 TestFunctional/serial/ComponentHealth 0.06
83 TestFunctional/serial/LogsCmd 2.76
84 TestFunctional/serial/LogsFileCmd 2.86
85 TestFunctional/serial/InvalidService 4.5
87 TestFunctional/parallel/ConfigCmd 0.52
88 TestFunctional/parallel/DashboardCmd 13.14
89 TestFunctional/parallel/DryRun 1.8
90 TestFunctional/parallel/InternationalLanguage 0.63
91 TestFunctional/parallel/StatusCmd 0.51
95 TestFunctional/parallel/ServiceCmdConnect 12.36
96 TestFunctional/parallel/AddonsCmd 0.24
97 TestFunctional/parallel/PersistentVolumeClaim 26.43
99 TestFunctional/parallel/SSHCmd 0.31
100 TestFunctional/parallel/CpCmd 0.96
101 TestFunctional/parallel/MySQL 25.55
102 TestFunctional/parallel/FileSync 0.15
103 TestFunctional/parallel/CertSync 0.93
107 TestFunctional/parallel/NodeLabels 0.05
109 TestFunctional/parallel/NonActiveRuntimeDisabled 0.2
111 TestFunctional/parallel/License 0.47
113 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.37
114 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.02
116 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 12.13
117 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.05
118 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.02
119 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.04
120 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.03
121 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.02
122 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.13
123 TestFunctional/parallel/ServiceCmd/DeployApp 7.14
124 TestFunctional/parallel/ProfileCmd/profile_not_create 0.25
125 TestFunctional/parallel/ProfileCmd/profile_list 0.25
126 TestFunctional/parallel/ProfileCmd/profile_json_output 0.25
127 TestFunctional/parallel/MountCmd/any-port 6.16
128 TestFunctional/parallel/ServiceCmd/List 0.39
129 TestFunctional/parallel/ServiceCmd/JSONOutput 0.38
130 TestFunctional/parallel/ServiceCmd/HTTPS 0.29
131 TestFunctional/parallel/MountCmd/specific-port 1.62
132 TestFunctional/parallel/ServiceCmd/Format 0.33
133 TestFunctional/parallel/ServiceCmd/URL 0.27
134 TestFunctional/parallel/MountCmd/VerifyCleanup 1.99
135 TestFunctional/parallel/Version/short 0.1
136 TestFunctional/parallel/Version/components 0.5
137 TestFunctional/parallel/ImageCommands/ImageListShort 0.19
138 TestFunctional/parallel/ImageCommands/ImageListTable 0.16
139 TestFunctional/parallel/ImageCommands/ImageListJson 0.15
140 TestFunctional/parallel/ImageCommands/ImageListYaml 0.2
141 TestFunctional/parallel/ImageCommands/ImageBuild 2.67
142 TestFunctional/parallel/ImageCommands/Setup 1.7
143 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 0.9
144 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 0.6
145 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 1.4
146 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.29
147 TestFunctional/parallel/ImageCommands/ImageRemove 0.33
148 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 0.6
149 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 0.41
150 TestFunctional/parallel/DockerEnv/bash 0.59
151 TestFunctional/parallel/UpdateContextCmd/no_changes 0.24
152 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.21
153 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.19
154 TestFunctional/delete_echo-server_images 0.04
155 TestFunctional/delete_my-image_image 0.02
156 TestFunctional/delete_minikube_cached_images 0.02
160 TestMultiControlPlane/serial/StartCluster 190.45
161 TestMultiControlPlane/serial/DeployApp 6.36
162 TestMultiControlPlane/serial/PingHostFromPods 1.32
164 TestMultiControlPlane/serial/NodeLabels 0.05
165 TestMultiControlPlane/serial/HAppyAfterClusterStart 0.35
168 TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop 0.27
170 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart 0.34
173 TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete 0.26
174 TestMultiControlPlane/serial/StopCluster 18.99
176 TestMultiControlPlane/serial/DegradedAfterClusterRestart 0.25
178 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd 0.33
181 TestImageBuild/serial/Setup 37.94
182 TestImageBuild/serial/NormalBuild 1.59
183 TestImageBuild/serial/BuildWithBuildArg 0.78
184 TestImageBuild/serial/BuildWithDockerIgnore 0.75
185 TestImageBuild/serial/BuildWithSpecifiedDockerfile 0.63
189 TestJSONOutput/start/Command 75.05
190 TestJSONOutput/start/Audit 0
192 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
193 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
195 TestJSONOutput/pause/Command 0.5
196 TestJSONOutput/pause/Audit 0
198 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
199 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
201 TestJSONOutput/unpause/Command 0.45
202 TestJSONOutput/unpause/Audit 0
204 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
205 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
207 TestJSONOutput/stop/Command 8.33
208 TestJSONOutput/stop/Audit 0
210 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
211 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
212 TestErrorJSONOutput 0.58
217 TestMainNoArgs 0.08
225 TestMultiNode/serial/FreshStart2Nodes 106.52
226 TestMultiNode/serial/DeployApp2Nodes 5.14
227 TestMultiNode/serial/PingHostFrom2Pods 0.9
228 TestMultiNode/serial/AddNode 45.76
229 TestMultiNode/serial/MultiNodeLabels 0.05
230 TestMultiNode/serial/ProfileList 0.17
231 TestMultiNode/serial/CopyFile 5.24
232 TestMultiNode/serial/StopNode 2.84
233 TestMultiNode/serial/StartAfterStop 41.67
238 TestMultiNode/serial/ValidateNameConflict 43.85
242 TestPreload 139.59
245 TestSkaffold 109.83
248 TestRunningBinaryUpgrade 87.79
250 TestKubernetesUpgrade 119.61
263 TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current 3.28
264 TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current 6.56
272 TestStoppedBinaryUpgrade/Setup 1.67
273 TestStoppedBinaryUpgrade/Upgrade 1339.51
274 TestStoppedBinaryUpgrade/MinikubeLogs 2.84
276 TestNoKubernetes/serial/StartNoK8sWithVersion 0.66
277 TestNoKubernetes/serial/StartWithK8s 41.38
278 TestNoKubernetes/serial/StartWithStopK8s 9.28
281 TestNoKubernetes/serial/Start 78.99
282 TestNoKubernetes/serial/VerifyK8sNotRunning 0.13
283 TestNoKubernetes/serial/ProfileList 0.37
284 TestNoKubernetes/serial/Stop 2.36
285 TestNoKubernetes/serial/StartNoArgs 75.57
287 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.12
TestDownloadOnly/v1.20.0/json-events (14.32s)

=== RUN   TestDownloadOnly/v1.20.0/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-798000 --force --alsologtostderr --kubernetes-version=v1.20.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-798000 --force --alsologtostderr --kubernetes-version=v1.20.0 --container-runtime=docker --driver=hyperkit : (14.316501379s)
--- PASS: TestDownloadOnly/v1.20.0/json-events (14.32s)

TestDownloadOnly/v1.20.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.20.0/preload-exists
--- PASS: TestDownloadOnly/v1.20.0/preload-exists (0.00s)

TestDownloadOnly/v1.20.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.20.0/kubectl
--- PASS: TestDownloadOnly/v1.20.0/kubectl (0.00s)

TestDownloadOnly/v1.20.0/LogsDuration (0.3s)

=== RUN   TestDownloadOnly/v1.20.0/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-798000
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-798000: exit status 85 (300.313385ms)

-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-798000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT |          |
	|         | -p download-only-798000        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.20.0   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:05:02
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:05:02.824657    1485 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:05:02.824849    1485 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:05:02.824854    1485 out.go:358] Setting ErrFile to fd 2...
	I0831 15:05:02.824863    1485 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:05:02.825022    1485 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	W0831 15:05:02.825124    1485 root.go:314] Error reading config file at /Users/jenkins/minikube-integration/18943-957/.minikube/config/config.json: open /Users/jenkins/minikube-integration/18943-957/.minikube/config/config.json: no such file or directory
	I0831 15:05:02.826854    1485 out.go:352] Setting JSON to true
	I0831 15:05:02.849643    1485 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":273,"bootTime":1725141629,"procs":412,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:05:02.849871    1485 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:05:02.871216    1485 out.go:97] [download-only-798000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:05:02.871441    1485 notify.go:220] Checking for updates...
	W0831 15:05:02.871450    1485 preload.go:293] Failed to list preload files: open /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball: no such file or directory
	I0831 15:05:02.892843    1485 out.go:169] MINIKUBE_LOCATION=18943
	I0831 15:05:02.915939    1485 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:05:02.937942    1485 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:05:02.959087    1485 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:05:02.979835    1485 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	W0831 15:05:03.021837    1485 out.go:321] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0831 15:05:03.022293    1485 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:05:03.071817    1485 out.go:97] Using the hyperkit driver based on user configuration
	I0831 15:05:03.071879    1485 start.go:297] selected driver: hyperkit
	I0831 15:05:03.071896    1485 start.go:901] validating driver "hyperkit" against <nil>
	I0831 15:05:03.072124    1485 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:05:03.072487    1485 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:05:03.469554    1485 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:05:03.474050    1485 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:05:03.474072    1485 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:05:03.474103    1485 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 15:05:03.478746    1485 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0831 15:05:03.478928    1485 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0831 15:05:03.478989    1485 cni.go:84] Creating CNI manager for ""
	I0831 15:05:03.479003    1485 cni.go:162] CNI unnecessary in this configuration, recommending no CNI
	I0831 15:05:03.479074    1485 start.go:340] cluster config:
	{Name:download-only-798000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:6000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.20.0 ClusterName:download-only-798000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local
ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.20.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:05:03.479285    1485 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:05:03.500123    1485 out.go:97] Downloading VM boot image ...
	I0831 15:05:03.500192    1485 download.go:107] Downloading: https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso.sha256 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/iso/amd64/minikube-v1.33.1-1724862017-19530-amd64.iso
	I0831 15:05:09.068570    1485 out.go:97] Starting "download-only-798000" primary control-plane node in "download-only-798000" cluster
	I0831 15:05:09.068611    1485 preload.go:131] Checking if preload exists for k8s version v1.20.0 and runtime docker
	I0831 15:05:09.123826    1485 preload.go:118] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.20.0/preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4
	I0831 15:05:09.123856    1485 cache.go:56] Caching tarball of preloaded images
	I0831 15:05:09.124191    1485 preload.go:131] Checking if preload exists for k8s version v1.20.0 and runtime docker
	I0831 15:05:09.144709    1485 out.go:97] Downloading Kubernetes v1.20.0 preload ...
	I0831 15:05:09.144737    1485 preload.go:236] getting checksum for preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4 ...
	I0831 15:05:09.220441    1485 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.20.0/preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4?checksum=md5:9a82241e9b8b4ad2b5cca73108f2c7a3 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4
	
	
	* The control-plane node download-only-798000 host does not exist
	  To start a cluster, run: "minikube start -p download-only-798000"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.20.0/LogsDuration (0.30s)

                                                
                                    
TestDownloadOnly/v1.20.0/DeleteAll (0.23s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/v1.20.0/DeleteAll (0.23s)

                                                
                                    
TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds (0.21s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-798000
--- PASS: TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds (0.21s)

                                                
                                    
TestDownloadOnly/v1.31.0/json-events (7.46s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.31.0/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-982000 --force --alsologtostderr --kubernetes-version=v1.31.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-982000 --force --alsologtostderr --kubernetes-version=v1.31.0 --container-runtime=docker --driver=hyperkit : (7.462519855s)
--- PASS: TestDownloadOnly/v1.31.0/json-events (7.46s)

                                                
                                    
TestDownloadOnly/v1.31.0/preload-exists (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.31.0/preload-exists
--- PASS: TestDownloadOnly/v1.31.0/preload-exists (0.00s)

                                                
                                    
TestDownloadOnly/v1.31.0/kubectl (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.31.0/kubectl
--- PASS: TestDownloadOnly/v1.31.0/kubectl (0.00s)

                                                
                                    
TestDownloadOnly/v1.31.0/LogsDuration (0.29s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.31.0/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-982000
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-982000: exit status 85 (292.580828ms)

                                                
                                                
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only        | download-only-798000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT |                     |
	|         | -p download-only-798000        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.20.0   |                      |         |         |                     |                     |
	|         | --container-runtime=docker     |                      |         |         |                     |                     |
	|         | --driver=hyperkit              |                      |         |         |                     |                     |
	| delete  | --all                          | minikube             | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT | 31 Aug 24 15:05 PDT |
	| delete  | -p download-only-798000        | download-only-798000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT | 31 Aug 24 15:05 PDT |
	| start   | -o=json --download-only        | download-only-982000 | jenkins | v1.33.1 | 31 Aug 24 15:05 PDT |                     |
	|         | -p download-only-982000        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.31.0   |                      |         |         |                     |                     |
	|         | --container-runtime=docker     |                      |         |         |                     |                     |
	|         | --driver=hyperkit              |                      |         |         |                     |                     |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/08/31 15:05:17
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0831 15:05:17.884814    1509 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:05:17.884974    1509 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:05:17.884979    1509 out.go:358] Setting ErrFile to fd 2...
	I0831 15:05:17.884983    1509 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:05:17.885159    1509 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:05:17.886525    1509 out.go:352] Setting JSON to true
	I0831 15:05:17.908383    1509 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":288,"bootTime":1725141629,"procs":412,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:05:17.908471    1509 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:05:17.929710    1509 out.go:97] [download-only-982000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:05:17.929912    1509 notify.go:220] Checking for updates...
	I0831 15:05:17.951527    1509 out.go:169] MINIKUBE_LOCATION=18943
	I0831 15:05:17.972735    1509 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:05:17.994857    1509 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:05:18.016490    1509 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:05:18.039623    1509 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	W0831 15:05:18.081364    1509 out.go:321] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0831 15:05:18.081839    1509 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:05:18.112593    1509 out.go:97] Using the hyperkit driver based on user configuration
	I0831 15:05:18.112666    1509 start.go:297] selected driver: hyperkit
	I0831 15:05:18.112680    1509 start.go:901] validating driver "hyperkit" against <nil>
	I0831 15:05:18.112908    1509 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:05:18.113176    1509 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18943-957/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0831 15:05:18.122911    1509 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0831 15:05:18.127689    1509 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:05:18.127710    1509 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0831 15:05:18.127736    1509 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0831 15:05:18.130363    1509 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0831 15:05:18.130502    1509 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0831 15:05:18.130530    1509 cni.go:84] Creating CNI manager for ""
	I0831 15:05:18.130543    1509 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0831 15:05:18.130552    1509 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0831 15:05:18.130616    1509 start.go:340] cluster config:
	{Name:download-only-982000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:6000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:download-only-982000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:05:18.130705    1509 iso.go:125] acquiring lock: {Name:mk6e91575b208577856769ef01f8e000bc57c787 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0831 15:05:18.151615    1509 out.go:97] Starting "download-only-982000" primary control-plane node in "download-only-982000" cluster
	I0831 15:05:18.151649    1509 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:05:18.220522    1509 preload.go:118] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.31.0/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	I0831 15:05:18.220605    1509 cache.go:56] Caching tarball of preloaded images
	I0831 15:05:18.221090    1509 preload.go:131] Checking if preload exists for k8s version v1.31.0 and runtime docker
	I0831 15:05:18.242793    1509 out.go:97] Downloading Kubernetes v1.31.0 preload ...
	I0831 15:05:18.242820    1509 preload.go:236] getting checksum for preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4 ...
	I0831 15:05:18.320976    1509 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.31.0/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4?checksum=md5:2dd98f97b896d7a4f012ee403b477cc8 -> /Users/jenkins/minikube-integration/18943-957/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-docker-overlay2-amd64.tar.lz4
	
	
	* The control-plane node download-only-982000 host does not exist
	  To start a cluster, run: "minikube start -p download-only-982000"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.31.0/LogsDuration (0.29s)

                                                
                                    
TestDownloadOnly/v1.31.0/DeleteAll (0.23s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.31.0/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/v1.31.0/DeleteAll (0.23s)

                                                
                                    
TestDownloadOnly/v1.31.0/DeleteAlwaysSucceeds (0.21s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.31.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-982000
--- PASS: TestDownloadOnly/v1.31.0/DeleteAlwaysSucceeds (0.21s)

                                                
                                    
TestBinaryMirror (0.95s)

                                                
                                                
=== RUN   TestBinaryMirror
aaa_download_only_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 start --download-only -p binary-mirror-866000 --alsologtostderr --binary-mirror http://127.0.0.1:49637 --driver=hyperkit 
helpers_test.go:176: Cleaning up "binary-mirror-866000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p binary-mirror-866000
--- PASS: TestBinaryMirror (0.95s)

                                                
                                    
TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.19s)

                                                
                                                
=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

                                                
                                                

                                                
                                                
=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:1037: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-540000
addons_test.go:1037: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons enable dashboard -p addons-540000: exit status 85 (189.125583ms)

                                                
                                                
-- stdout --
	* Profile "addons-540000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-540000"

                                                
                                                
-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.19s)

                                                
                                    
TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.21s)

                                                
                                                
=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

                                                
                                                

                                                
                                                
=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:1048: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-540000
addons_test.go:1048: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons disable dashboard -p addons-540000: exit status 85 (209.949783ms)

                                                
                                                
-- stdout --
	* Profile "addons-540000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-540000"

                                                
                                                
-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.21s)

                                                
                                    
TestAddons/Setup (227.78s)

                                                
                                                
=== RUN   TestAddons/Setup
addons_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 start -p addons-540000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --addons=volcano --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:110: (dbg) Done: out/minikube-darwin-amd64 start -p addons-540000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --addons=volcano --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller: (3m47.784290977s)
--- PASS: TestAddons/Setup (227.78s)

                                                
                                    
TestAddons/serial/Volcano (40.65s)

                                                
                                                
=== RUN   TestAddons/serial/Volcano
addons_test.go:897: volcano-scheduler stabilized in 14.413389ms
addons_test.go:905: volcano-admission stabilized in 14.435759ms
addons_test.go:913: volcano-controller stabilized in 14.496163ms
addons_test.go:919: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-scheduler" in namespace "volcano-system" ...
helpers_test.go:345: "volcano-scheduler-576bc46687-84p4v" [00d7570b-75e3-49cb-b3c1-06d769d5225f] Running
addons_test.go:919: (dbg) TestAddons/serial/Volcano: app=volcano-scheduler healthy within 5.003610095s
addons_test.go:923: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-admission" in namespace "volcano-system" ...
helpers_test.go:345: "volcano-admission-77d7d48b68-nnmjj" [b997d90c-1273-4b46-b182-eed37f02ca8a] Running
addons_test.go:923: (dbg) TestAddons/serial/Volcano: app=volcano-admission healthy within 5.003376232s
addons_test.go:927: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-controller" in namespace "volcano-system" ...
helpers_test.go:345: "volcano-controllers-56675bb4d5-c7v8r" [f42997eb-4b84-405a-ab79-e02659613e17] Running
addons_test.go:927: (dbg) TestAddons/serial/Volcano: app=volcano-controller healthy within 5.004905258s
addons_test.go:932: (dbg) Run:  kubectl --context addons-540000 delete -n volcano-system job volcano-admission-init
addons_test.go:938: (dbg) Run:  kubectl --context addons-540000 create -f testdata/vcjob.yaml
addons_test.go:946: (dbg) Run:  kubectl --context addons-540000 get vcjob -n my-volcano
addons_test.go:964: (dbg) TestAddons/serial/Volcano: waiting 3m0s for pods matching "volcano.sh/job-name=test-job" in namespace "my-volcano" ...
helpers_test.go:345: "test-job-nginx-0" [dae77c29-ab12-4929-b5f5-9ad59775697f] Pending
helpers_test.go:345: "test-job-nginx-0" [dae77c29-ab12-4929-b5f5-9ad59775697f] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:345: "test-job-nginx-0" [dae77c29-ab12-4929-b5f5-9ad59775697f] Running
addons_test.go:964: (dbg) TestAddons/serial/Volcano: volcano.sh/job-name=test-job healthy within 15.003940027s
addons_test.go:968: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable volcano --alsologtostderr -v=1
addons_test.go:968: (dbg) Done: out/minikube-darwin-amd64 -p addons-540000 addons disable volcano --alsologtostderr -v=1: (10.33483316s)
--- PASS: TestAddons/serial/Volcano (40.65s)

                                                
                                    
TestAddons/serial/GCPAuth/Namespaces (0.1s)

                                                
                                                
=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:656: (dbg) Run:  kubectl --context addons-540000 create ns new-namespace
addons_test.go:670: (dbg) Run:  kubectl --context addons-540000 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.10s)

                                                
                                    
TestAddons/parallel/Ingress (20.35s)

                                                
                                                
=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/Ingress
addons_test.go:209: (dbg) Run:  kubectl --context addons-540000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:234: (dbg) Run:  kubectl --context addons-540000 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:247: (dbg) Run:  kubectl --context addons-540000 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:345: "nginx" [25365911-81eb-47e5-a5bd-f6a1878e03b6] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:345: "nginx" [25365911-81eb-47e5-a5bd-f6a1878e03b6] Running
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 11.005922121s
addons_test.go:264: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:288: (dbg) Run:  kubectl --context addons-540000 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:293: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 ip
addons_test.go:299: (dbg) Run:  nslookup hello-john.test 192.169.0.2
addons_test.go:308: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:313: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable ingress --alsologtostderr -v=1
addons_test.go:313: (dbg) Done: out/minikube-darwin-amd64 -p addons-540000 addons disable ingress --alsologtostderr -v=1: (7.444005819s)
--- PASS: TestAddons/parallel/Ingress (20.35s)
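The `curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'` check above works because an ingress controller routes by the HTTP `Host` header rather than the destination IP. A minimal sketch of that host-based dispatch (hypothetical `rules` map for illustration; not the actual nginx-ingress configuration):

```python
def route_by_host(host: str, rules: dict, default: str = "404") -> str:
    """Pick a backend service by the request's Host header, as an ingress would.

    The port suffix is stripped and the hostname lowercased before lookup,
    since Host headers may arrive as e.g. "nginx.example.com:80".
    """
    return rules.get(host.split(":")[0].lower(), default)
```

With `rules = {"nginx.example.com": "nginx-svc"}`, any request carrying that Host header is dispatched to `nginx-svc`, while unknown hosts fall through to the default backend.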

                                                
                                    
TestAddons/parallel/InspektorGadget (10.52s)

                                                
                                                
=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:848: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:345: "gadget-9nq9w" [105e19e8-096e-4ce5-9c14-93fc268595e3] Running / Ready:ContainersNotReady (containers with unready status: [gadget]) / ContainersReady:ContainersNotReady (containers with unready status: [gadget])
addons_test.go:848: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 5.005494168s
addons_test.go:851: (dbg) Run:  out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-540000
addons_test.go:851: (dbg) Done: out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-540000: (5.513464583s)
--- PASS: TestAddons/parallel/InspektorGadget (10.52s)

                                                
                                    
TestAddons/parallel/MetricsServer (5.49s)

                                                
                                                
=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:409: metrics-server stabilized in 1.549723ms
addons_test.go:411: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:345: "metrics-server-84c5f94fbc-nt8q4" [770f613a-32ba-44c2-bf29-86d02a6c4f18] Running
addons_test.go:411: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.004641159s
addons_test.go:417: (dbg) Run:  kubectl --context addons-540000 top pods -n kube-system
addons_test.go:434: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.49s)

                                                
                                    
TestAddons/parallel/HelmTiller (9.52s)

                                                
                                                
=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:458: tiller-deploy stabilized in 2.018734ms
addons_test.go:460: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:345: "tiller-deploy-b48cc5f79-fd6zx" [acc8c5ea-8b97-4176-a6fc-526116814954] Running
addons_test.go:460: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 5.004879769s
addons_test.go:475: (dbg) Run:  kubectl --context addons-540000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version
addons_test.go:475: (dbg) Done: kubectl --context addons-540000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version: (4.093732949s)
addons_test.go:492: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (9.52s)

                                                
                                    
TestAddons/parallel/CSI (41.86s)

                                                
                                                
=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/CSI
addons_test.go:567: csi-hostpath-driver pods stabilized in 4.490257ms
addons_test.go:570: (dbg) Run:  kubectl --context addons-540000 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:575: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:580: (dbg) Run:  kubectl --context addons-540000 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:585: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:345: "task-pv-pod" [6ad06dd6-6805-4382-b750-fae56cc61348] Pending
helpers_test.go:345: "task-pv-pod" [6ad06dd6-6805-4382-b750-fae56cc61348] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:345: "task-pv-pod" [6ad06dd6-6805-4382-b750-fae56cc61348] Running
addons_test.go:585: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 7.005696271s
addons_test.go:590: (dbg) Run:  kubectl --context addons-540000 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:595: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:420: (dbg) Run:  kubectl --context addons-540000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:420: (dbg) Run:  kubectl --context addons-540000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:600: (dbg) Run:  kubectl --context addons-540000 delete pod task-pv-pod
addons_test.go:606: (dbg) Run:  kubectl --context addons-540000 delete pvc hpvc
addons_test.go:612: (dbg) Run:  kubectl --context addons-540000 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:617: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:622: (dbg) Run:  kubectl --context addons-540000 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:627: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:345: "task-pv-pod-restore" [90b954f1-fccc-46a9-b483-9f6f35eae428] Pending
helpers_test.go:345: "task-pv-pod-restore" [90b954f1-fccc-46a9-b483-9f6f35eae428] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:345: "task-pv-pod-restore" [90b954f1-fccc-46a9-b483-9f6f35eae428] Running
addons_test.go:627: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 8.003561216s
addons_test.go:632: (dbg) Run:  kubectl --context addons-540000 delete pod task-pv-pod-restore
addons_test.go:636: (dbg) Run:  kubectl --context addons-540000 delete pvc hpvc-restore
addons_test.go:640: (dbg) Run:  kubectl --context addons-540000 delete volumesnapshot new-snapshot-demo
addons_test.go:644: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:644: (dbg) Done: out/minikube-darwin-amd64 -p addons-540000 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.412494121s)
addons_test.go:648: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (41.86s)

TestAddons/parallel/Headlamp (18.41s)
=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp
=== CONT  TestAddons/parallel/Headlamp
addons_test.go:830: (dbg) Run:  out/minikube-darwin-amd64 addons enable headlamp -p addons-540000 --alsologtostderr -v=1
addons_test.go:835: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:345: "headlamp-57fb76fcdb-2ngl2" [573c9421-dc2a-4ad5-a163-2f52b3e56763] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:345: "headlamp-57fb76fcdb-2ngl2" [573c9421-dc2a-4ad5-a163-2f52b3e56763] Running
addons_test.go:835: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 12.005630666s
addons_test.go:839: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable headlamp --alsologtostderr -v=1
addons_test.go:839: (dbg) Done: out/minikube-darwin-amd64 -p addons-540000 addons disable headlamp --alsologtostderr -v=1: (5.465257388s)
--- PASS: TestAddons/parallel/Headlamp (18.41s)

TestAddons/parallel/CloudSpanner (5.35s)
=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner
=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:867: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:345: "cloud-spanner-emulator-769b77f747-xw9pt" [cf5817d6-b32e-48ee-8780-fbce02d0509e] Running
addons_test.go:867: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.004855368s
addons_test.go:870: (dbg) Run:  out/minikube-darwin-amd64 addons disable cloud-spanner -p addons-540000
--- PASS: TestAddons/parallel/CloudSpanner (5.35s)

TestAddons/parallel/LocalPath (52.48s)
=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath
=== CONT  TestAddons/parallel/LocalPath
addons_test.go:982: (dbg) Run:  kubectl --context addons-540000 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:988: (dbg) Run:  kubectl --context addons-540000 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:992: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:395: (dbg) Run:  kubectl --context addons-540000 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:995: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:345: "test-local-path" [7b9ccc34-5e08-4695-aaad-d3c3812ccb47] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:345: "test-local-path" [7b9ccc34-5e08-4695-aaad-d3c3812ccb47] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:345: "test-local-path" [7b9ccc34-5e08-4695-aaad-d3c3812ccb47] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:995: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 4.00302969s
addons_test.go:1000: (dbg) Run:  kubectl --context addons-540000 get pvc test-pvc -o=json
addons_test.go:1009: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 ssh "cat /opt/local-path-provisioner/pvc-4058ab84-8955-4fff-b124-718dab365f42_default_test-pvc/file1"
addons_test.go:1021: (dbg) Run:  kubectl --context addons-540000 delete pod test-local-path
addons_test.go:1025: (dbg) Run:  kubectl --context addons-540000 delete pvc test-pvc
addons_test.go:1029: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:1029: (dbg) Done: out/minikube-darwin-amd64 -p addons-540000 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (42.826555555s)
--- PASS: TestAddons/parallel/LocalPath (52.48s)

TestAddons/parallel/NvidiaDevicePlugin (5.35s)
=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin
=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:1061: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:345: "nvidia-device-plugin-daemonset-q998b" [2d50fc12-bdb1-49e7-ae12-d5e775633fc0] Running
addons_test.go:1061: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.004130442s
addons_test.go:1064: (dbg) Run:  out/minikube-darwin-amd64 addons disable nvidia-device-plugin -p addons-540000
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (5.35s)

TestAddons/parallel/Yakd (10.44s)
=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd
=== CONT  TestAddons/parallel/Yakd
addons_test.go:1072: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:345: "yakd-dashboard-67d98fc6b-f7zvd" [91d2cf15-971e-4eb8-91b0-e2a42dd57397] Running
addons_test.go:1072: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 5.004526277s
addons_test.go:1076: (dbg) Run:  out/minikube-darwin-amd64 -p addons-540000 addons disable yakd --alsologtostderr -v=1
addons_test.go:1076: (dbg) Done: out/minikube-darwin-amd64 -p addons-540000 addons disable yakd --alsologtostderr -v=1: (5.431991449s)
--- PASS: TestAddons/parallel/Yakd (10.44s)

TestAddons/StoppedEnableDisable (5.92s)
=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 stop -p addons-540000
addons_test.go:174: (dbg) Done: out/minikube-darwin-amd64 stop -p addons-540000: (5.379965003s)
addons_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-540000
addons_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-540000
addons_test.go:187: (dbg) Run:  out/minikube-darwin-amd64 addons disable gvisor -p addons-540000
--- PASS: TestAddons/StoppedEnableDisable (5.92s)

TestHyperKitDriverInstallOrUpdate (8.13s)
=== RUN   TestHyperKitDriverInstallOrUpdate
=== PAUSE TestHyperKitDriverInstallOrUpdate
=== CONT  TestHyperKitDriverInstallOrUpdate
--- PASS: TestHyperKitDriverInstallOrUpdate (8.13s)

TestErrorSpam/setup (39.5s)
=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -p nospam-356000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 --driver=hyperkit 
error_spam_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -p nospam-356000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 --driver=hyperkit : (39.497519573s)
error_spam_test.go:91: acceptable stderr: "! /usr/local/bin/kubectl is version 1.29.2, which may have incompatibilities with Kubernetes 1.31.0."
--- PASS: TestErrorSpam/setup (39.50s)

TestErrorSpam/start (1.77s)
=== RUN   TestErrorSpam/start
error_spam_test.go:216: Cleaning up 1 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 start --dry-run
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 start --dry-run
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 start --dry-run
--- PASS: TestErrorSpam/start (1.77s)

TestErrorSpam/status (0.51s)
=== RUN   TestErrorSpam/status
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 status
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 status
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 status
--- PASS: TestErrorSpam/status (0.51s)

TestErrorSpam/pause (1.38s)
=== RUN   TestErrorSpam/pause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 pause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 pause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 pause
--- PASS: TestErrorSpam/pause (1.38s)

TestErrorSpam/unpause (1.34s)
=== RUN   TestErrorSpam/unpause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 unpause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 unpause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 unpause
--- PASS: TestErrorSpam/unpause (1.34s)

TestErrorSpam/stop (153.82s)
=== RUN   TestErrorSpam/stop
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 stop: (3.364638619s)
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 stop: (1m15.226816922s)
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 stop
error_spam_test.go:182: (dbg) Done: out/minikube-darwin-amd64 -p nospam-356000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-356000 stop: (1m15.225600196s)
--- PASS: TestErrorSpam/stop (153.82s)

TestFunctional/serial/CopySyncFile (0s)
=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1855: local sync path: /Users/jenkins/minikube-integration/18943-957/.minikube/files/etc/test/nested/copy/1483/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (160.95s)
=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2234: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-593000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit 
E0831 15:24:15.434227    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:15.443460    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:15.455420    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:15.478205    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:15.521866    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:15.603478    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:15.765018    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:16.088601    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:16.732294    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:18.015293    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:20.578986    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:25.701813    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:35.943644    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:24:56.425540    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:25:37.388336    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
functional_test.go:2234: (dbg) Done: out/minikube-darwin-amd64 start -p functional-593000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit : (2m40.947865338s)
--- PASS: TestFunctional/serial/StartWithProxy (160.95s)

TestFunctional/serial/AuditLog (0s)
=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (40.19s)
=== RUN   TestFunctional/serial/SoftStart
functional_test.go:659: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-593000 --alsologtostderr -v=8
functional_test.go:659: (dbg) Done: out/minikube-darwin-amd64 start -p functional-593000 --alsologtostderr -v=8: (40.184659166s)
functional_test.go:663: soft start took 40.185201669s for "functional-593000" cluster.
--- PASS: TestFunctional/serial/SoftStart (40.19s)

TestFunctional/serial/KubeContext (0.04s)
=== RUN   TestFunctional/serial/KubeContext
functional_test.go:681: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.04s)

TestFunctional/serial/KubectlGetPods (0.07s)
=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:696: (dbg) Run:  kubectl --context functional-593000 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.07s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.03s)
=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1049: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 cache add registry.k8s.io/pause:3.1
functional_test.go:1049: (dbg) Done: out/minikube-darwin-amd64 -p functional-593000 cache add registry.k8s.io/pause:3.1: (1.06434895s)
functional_test.go:1049: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 cache add registry.k8s.io/pause:3.3
functional_test.go:1049: (dbg) Done: out/minikube-darwin-amd64 -p functional-593000 cache add registry.k8s.io/pause:3.3: (1.027801704s)
functional_test.go:1049: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 cache add registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.03s)

TestFunctional/serial/CacheCmd/cache/add_local (1.33s)
=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1077: (dbg) Run:  docker build -t minikube-local-cache-test:functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialCacheCmdcacheadd_local910217610/001
functional_test.go:1089: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 cache add minikube-local-cache-test:functional-593000
functional_test.go:1094: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 cache delete minikube-local-cache-test:functional-593000
functional_test.go:1083: (dbg) Run:  docker rmi minikube-local-cache-test:functional-593000
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.33s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.08s)
=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1102: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.08s)

TestFunctional/serial/CacheCmd/cache/list (0.08s)
=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1110: (dbg) Run:  out/minikube-darwin-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.08s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.17s)
=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1124: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.17s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.02s)
=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1147: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh sudo docker rmi registry.k8s.io/pause:latest
functional_test.go:1153: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1153: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-593000 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (145.241529ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1158: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 cache reload
functional_test.go:1163: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
E0831 15:26:59.311468    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.02s)

TestFunctional/serial/CacheCmd/cache/delete (0.16s)
=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1172: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1172: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.16s)

                                                
                                    
x
+
TestFunctional/serial/MinikubeKubectlCmd (1.2s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:716: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 kubectl -- --context functional-593000 get pods
functional_test.go:716: (dbg) Done: out/minikube-darwin-amd64 -p functional-593000 kubectl -- --context functional-593000 get pods: (1.202055285s)
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (1.20s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (1.56s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:741: (dbg) Run:  out/kubectl --context functional-593000 get pods
functional_test.go:741: (dbg) Done: out/kubectl --context functional-593000 get pods: (1.560358226s)
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (1.56s)

TestFunctional/serial/ExtraConfig (39.26s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:757: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-593000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
functional_test.go:757: (dbg) Done: out/minikube-darwin-amd64 start -p functional-593000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (39.259201851s)
functional_test.go:761: restart took 39.259308348s for "functional-593000" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (39.26s)

TestFunctional/serial/ComponentHealth (0.06s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:810: (dbg) Run:  kubectl --context functional-593000 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:825: etcd phase: Running
functional_test.go:835: etcd status: Ready
functional_test.go:825: kube-apiserver phase: Running
functional_test.go:835: kube-apiserver status: Ready
functional_test.go:825: kube-controller-manager phase: Running
functional_test.go:835: kube-controller-manager status: Ready
functional_test.go:825: kube-scheduler phase: Running
functional_test.go:835: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.06s)
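The ComponentHealth check above fetches the control-plane pods as JSON and asserts that each reports phase Running and a Ready condition. A minimal Python sketch of that check, run against an illustrative single-pod payload (the field names follow the Kubernetes Pod API; the sample data is made up, not from this run):

```python
import json

# Hypothetical payload shaped like `kubectl get po -l tier=control-plane -n kube-system -o=json`.
pod_list = json.loads("""
{"items": [{"metadata": {"labels": {"component": "etcd"}},
            "status": {"phase": "Running",
                       "conditions": [{"type": "Ready", "status": "True"}]}}]}
""")

for pod in pod_list["items"]:
    component = pod["metadata"]["labels"]["component"]
    phase = pod["status"]["phase"]
    # A pod counts as healthy when its Ready condition is "True".
    ready = any(c["type"] == "Ready" and c["status"] == "True"
                for c in pod["status"]["conditions"])
    print(f"{component} phase: {phase}")
    print(f"{component} status: {'Ready' if ready else 'NotReady'}")
```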

TestFunctional/serial/LogsCmd (2.76s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1236: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 logs
functional_test.go:1236: (dbg) Done: out/minikube-darwin-amd64 -p functional-593000 logs: (2.759839191s)
--- PASS: TestFunctional/serial/LogsCmd (2.76s)

TestFunctional/serial/LogsFileCmd (2.86s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1250: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 logs --file /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialLogsFileCmd1724239274/001/logs.txt
functional_test.go:1250: (dbg) Done: out/minikube-darwin-amd64 -p functional-593000 logs --file /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialLogsFileCmd1724239274/001/logs.txt: (2.860459424s)
--- PASS: TestFunctional/serial/LogsFileCmd (2.86s)

TestFunctional/serial/InvalidService (4.5s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2321: (dbg) Run:  kubectl --context functional-593000 apply -f testdata/invalidsvc.yaml
functional_test.go:2335: (dbg) Run:  out/minikube-darwin-amd64 service invalid-svc -p functional-593000
functional_test.go:2335: (dbg) Non-zero exit: out/minikube-darwin-amd64 service invalid-svc -p functional-593000: exit status 115 (264.632345ms)

-- stdout --
	|-----------|-------------|-------------|--------------------------|
	| NAMESPACE |    NAME     | TARGET PORT |           URL            |
	|-----------|-------------|-------------|--------------------------|
	| default   | invalid-svc |          80 | http://192.169.0.4:30540 |
	|-----------|-------------|-------------|--------------------------|
	
	

-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                            │
	│    * If the above advice does not help, please let us know:                                                                │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                              │
	│                                                                                                                            │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                   │
	│    * Please also attach the following file to the GitHub issue:                                                            │
	│    * - /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log    │
	│                                                                                                                            │
	╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
functional_test.go:2327: (dbg) Run:  kubectl --context functional-593000 delete -f testdata/invalidsvc.yaml
functional_test.go:2327: (dbg) Done: kubectl --context functional-593000 delete -f testdata/invalidsvc.yaml: (1.105525983s)
--- PASS: TestFunctional/serial/InvalidService (4.50s)
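The SVC_UNREACHABLE exit above fires because no running pod backs the invalid-svc service. That condition amounts to a selector match over pod labels; the sketch below uses illustrative data shapes, not the Kubernetes client API:

```python
def has_running_backend(selector, pods):
    """True if some Running pod carries every label in the service selector."""
    return any(
        pod["phase"] == "Running"
        and all(pod["labels"].get(k) == v for k, v in selector.items())
        for pod in pods
    )

# In the failing run above, the service's selector matches no pod at all.
unreachable = not has_running_backend({"app": "invalid-svc"}, [])

# A backed service, for contrast (hypothetical pod data).
backed_pods = [{"phase": "Running", "labels": {"app": "hello-node"}}]
reachable = has_running_backend({"app": "hello-node"}, backed_pods)
```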

TestFunctional/parallel/ConfigCmd (0.52s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1199: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 config unset cpus
functional_test.go:1199: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 config get cpus
functional_test.go:1199: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-593000 config get cpus: exit status 14 (70.390015ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1199: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 config set cpus 2
functional_test.go:1199: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 config get cpus
functional_test.go:1199: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 config unset cpus
functional_test.go:1199: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 config get cpus
functional_test.go:1199: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-593000 config get cpus: exit status 14 (55.391409ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.52s)
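In the sequence above, `config get` on an unset key exits with status 14 and prints "Error: specified key could not be found in config", while `set` followed by `get` succeeds. A small sketch of those semantics (the Config class is hypothetical; only the exit codes and message mirror the log):

```python
class Config:
    """Toy stand-in for a profile config store with set/get/unset semantics."""

    def __init__(self):
        self._values = {}

    def set(self, key, value):
        self._values[key] = value
        return 0

    def unset(self, key):
        self._values.pop(key, None)
        return 0

    def get(self, key):
        # Missing key: print the error and return the exit status seen in the log.
        if key not in self._values:
            print("Error: specified key could not be found in config")
            return 14
        print(self._values[key])
        return 0

cfg = Config()
cfg.set("cpus", "2")
cfg.unset("cpus")
```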

TestFunctional/parallel/DashboardCmd (13.14s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:905: (dbg) daemon: [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-593000 --alsologtostderr -v=1]
functional_test.go:910: (dbg) stopping [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-593000 --alsologtostderr -v=1] ...
helpers_test.go:509: unable to kill pid 2663: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (13.14s)

TestFunctional/parallel/DryRun (1.8s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:974: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-593000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:974: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-593000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (1.291631423s)

-- stdout --
	* [functional-593000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I0831 15:28:28.032798    2593 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:28:28.053322    2593 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:28:28.053335    2593 out.go:358] Setting ErrFile to fd 2...
	I0831 15:28:28.053341    2593 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:28:28.053604    2593 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:28:28.095959    2593 out.go:352] Setting JSON to false
	I0831 15:28:28.120362    2593 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1679,"bootTime":1725141629,"procs":508,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:28:28.120460    2593 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:28:28.224292    2593 out.go:177] * [functional-593000] minikube v1.33.1 on Darwin 14.6.1
	I0831 15:28:28.287492    2593 notify.go:220] Checking for updates...
	I0831 15:28:28.328986    2593 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:28:28.433932    2593 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:28:28.538815    2593 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:28:28.643806    2593 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:28:28.749259    2593 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:28:28.909198    2593 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:28:28.989363    2593 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:28:28.990094    2593 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:28:28.990206    2593 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:28:29.000084    2593 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50746
	I0831 15:28:29.000457    2593 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:28:29.000966    2593 main.go:141] libmachine: Using API Version  1
	I0831 15:28:29.000982    2593 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:28:29.001237    2593 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:28:29.001366    2593 main.go:141] libmachine: (functional-593000) Calling .DriverName
	I0831 15:28:29.001574    2593 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:28:29.001817    2593 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:28:29.001838    2593 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:28:29.010290    2593 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50748
	I0831 15:28:29.010642    2593 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:28:29.010954    2593 main.go:141] libmachine: Using API Version  1
	I0831 15:28:29.010964    2593 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:28:29.011206    2593 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:28:29.011327    2593 main.go:141] libmachine: (functional-593000) Calling .DriverName
	I0831 15:28:29.047950    2593 out.go:177] * Using the hyperkit driver based on existing profile
	I0831 15:28:29.090023    2593 start.go:297] selected driver: hyperkit
	I0831 15:28:29.090051    2593 start.go:901] validating driver "hyperkit" against &{Name:functional-593000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:4000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:functional-593000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.4 Port:8441 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:28:29.090247    2593 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:28:29.135038    2593 out.go:201] 
	W0831 15:28:29.155869    2593 out.go:270] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I0831 15:28:29.177113    2593 out.go:201] 

** /stderr **
functional_test.go:991: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-593000 --dry-run --alsologtostderr -v=1 --driver=hyperkit 
--- PASS: TestFunctional/parallel/DryRun (1.80s)
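Both dry-run attempts above exit with status 23 because the requested 250MB is below the 1800MB usable minimum minikube reports. A sketch of that validation, under the assumption that it is a simple threshold check (the function and constant names are illustrative; only the threshold, exit status, and message text come from the log):

```python
MIN_USABLE_MB = 1800  # usable minimum reported in the log above

def validate_memory(requested_mb):
    """Return (exit_code, message) for a requested memory allocation.

    Illustrative only: mirrors the RSRC_INSUFFICIENT_REQ_MEMORY exit in the
    log, not minikube's actual implementation.
    """
    if requested_mb < MIN_USABLE_MB:
        return 23, (
            "Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory "
            f"allocation {requested_mb}MiB is less than the usable minimum "
            f"of {MIN_USABLE_MB}MB"
        )
    return 0, "ok"

code, msg = validate_memory(250)
```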

TestFunctional/parallel/InternationalLanguage (0.63s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1020: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-593000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:1020: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-593000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (626.2185ms)

-- stdout --
	* [functional-593000] minikube v1.33.1 sur Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote hyperkit basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I0831 15:28:27.367479    2561 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:28:27.367789    2561 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:28:27.367800    2561 out.go:358] Setting ErrFile to fd 2...
	I0831 15:28:27.367807    2561 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:28:27.368078    2561 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:28:27.370031    2561 out.go:352] Setting JSON to false
	I0831 15:28:27.393441    2561 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1678,"bootTime":1725141629,"procs":494,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.6.1","kernelVersion":"23.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0831 15:28:27.393568    2561 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0831 15:28:27.415059    2561 out.go:177] * [functional-593000] minikube v1.33.1 sur Darwin 14.6.1
	I0831 15:28:27.458381    2561 notify.go:220] Checking for updates...
	I0831 15:28:27.480028    2561 out.go:177]   - MINIKUBE_LOCATION=18943
	I0831 15:28:27.500831    2561 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	I0831 15:28:27.522037    2561 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0831 15:28:27.579894    2561 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0831 15:28:27.622147    2561 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	I0831 15:28:27.642898    2561 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0831 15:28:27.664678    2561 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:28:27.665164    2561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:28:27.665215    2561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:28:27.674258    2561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50709
	I0831 15:28:27.674647    2561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:28:27.675095    2561 main.go:141] libmachine: Using API Version  1
	I0831 15:28:27.675107    2561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:28:27.675362    2561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:28:27.675480    2561 main.go:141] libmachine: (functional-593000) Calling .DriverName
	I0831 15:28:27.675698    2561 driver.go:392] Setting default libvirt URI to qemu:///system
	I0831 15:28:27.675975    2561 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:28:27.676003    2561 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:28:27.685045    2561 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50711
	I0831 15:28:27.685500    2561 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:28:27.685888    2561 main.go:141] libmachine: Using API Version  1
	I0831 15:28:27.685911    2561 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:28:27.686150    2561 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:28:27.686275    2561 main.go:141] libmachine: (functional-593000) Calling .DriverName
	I0831 15:28:27.716092    2561 out.go:177] * Utilisation du pilote hyperkit basé sur le profil existant
	I0831 15:28:27.758313    2561 start.go:297] selected driver: hyperkit
	I0831 15:28:27.758343    2561 start.go:901] validating driver "hyperkit" against &{Name:functional-593000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19530/minikube-v1.33.1-1724862017-19530-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1724862063-19530@sha256:fd0f41868bf20a720502cce04c5201bfb064f3c267161af6fd5265d69c85c9f0 Memory:4000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0 ClusterName:functional-593000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.4 Port:8441 KubernetesVersion:v1.31.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0831 15:28:27.758533    2561 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0831 15:28:27.785056    2561 out.go:201] 
	W0831 15:28:27.805859    2561 out.go:270] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I0831 15:28:27.864076    2561 out.go:201] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.63s)

TestFunctional/parallel/StatusCmd (0.51s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:854: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 status
functional_test.go:860: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:872: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.51s)

TestFunctional/parallel/ServiceCmdConnect (12.36s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1629: (dbg) Run:  kubectl --context functional-593000 create deployment hello-node-connect --image=registry.k8s.io/echoserver:1.8
functional_test.go:1635: (dbg) Run:  kubectl --context functional-593000 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1640: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:345: "hello-node-connect-67bdd5bbb4-mzwml" [0061e17d-1cc7-42e2-8c2e-dc2ed1860224] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:345: "hello-node-connect-67bdd5bbb4-mzwml" [0061e17d-1cc7-42e2-8c2e-dc2ed1860224] Running
functional_test.go:1640: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 12.005607406s
functional_test.go:1649: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 service hello-node-connect --url
functional_test.go:1655: found endpoint for hello-node-connect: http://192.169.0.4:31887
functional_test.go:1675: http://192.169.0.4:31887: success! body:

Hostname: hello-node-connect-67bdd5bbb4-mzwml

Pod Information:
	-no pod information available-

Server values:
	server_version=nginx: 1.13.3 - lua: 10008

Request Information:
	client_address=10.244.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.169.0.4:8080/

Request Headers:
	accept-encoding=gzip
	host=192.169.0.4:31887
	user-agent=Go-http-client/1.1

Request Body:
	-no body in request-

--- PASS: TestFunctional/parallel/ServiceCmdConnect (12.36s)
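The connect test asks `service hello-node-connect --url` for a NodePort endpoint and then GETs it; note the echoed `host=192.169.0.4:31887` header matches the URL. Splitting such an endpoint into the pieces an HTTP client would use can be done with the standard library (the endpoint value is the one printed in this run):

```python
from urllib.parse import urlparse

# Endpoint printed by `minikube service hello-node-connect --url` above.
endpoint = "http://192.169.0.4:31887"
parts = urlparse(endpoint)

# The Host header a plain HTTP client sends, matching the echoed "host=" line.
host_header = f"{parts.hostname}:{parts.port}"
```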

TestFunctional/parallel/AddonsCmd (0.24s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1690: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 addons list
functional_test.go:1702: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.24s)

TestFunctional/parallel/PersistentVolumeClaim (26.43s)

                                                
                                                
=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:345: "storage-provisioner" [dfdc7df4-ccc1-463f-a05f-8f452be29a52] Running
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 6.004895895s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-593000 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-593000 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-593000 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-593000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:345: "sp-pod" [0e4ce45a-b694-4258-b962-c197ca5fb869] Pending
helpers_test.go:345: "sp-pod" [0e4ce45a-b694-4258-b962-c197ca5fb869] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:345: "sp-pod" [0e4ce45a-b694-4258-b962-c197ca5fb869] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 12.005023834s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-593000 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-593000 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-593000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:345: "sp-pod" [e0911d3e-1011-440a-a80a-b394add7ab9e] Pending
helpers_test.go:345: "sp-pod" [e0911d3e-1011-440a-a80a-b394add7ab9e] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:345: "sp-pod" [e0911d3e-1011-440a-a80a-b394add7ab9e] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 7.005344041s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-593000 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (26.43s)

TestFunctional/parallel/SSHCmd (0.31s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1725: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "echo hello"
functional_test.go:1742: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.31s)

TestFunctional/parallel/CpCmd (0.96s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh -n functional-593000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 cp functional-593000:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelCpCmd2610750758/001/cp-test.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh -n functional-593000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh -n functional-593000 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (0.96s)

TestFunctional/parallel/MySQL (25.55s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1793: (dbg) Run:  kubectl --context functional-593000 replace --force -f testdata/mysql.yaml
functional_test.go:1799: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:345: "mysql-6cdb49bbb-hnzlz" [d3ed461a-f1fc-4390-9252-439561fc40d9] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])
2024/08/31 15:28:42 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
helpers_test.go:345: "mysql-6cdb49bbb-hnzlz" [d3ed461a-f1fc-4390-9252-439561fc40d9] Running
functional_test.go:1799: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 23.003125369s
functional_test.go:1807: (dbg) Run:  kubectl --context functional-593000 exec mysql-6cdb49bbb-hnzlz -- mysql -ppassword -e "show databases;"
functional_test.go:1807: (dbg) Non-zero exit: kubectl --context functional-593000 exec mysql-6cdb49bbb-hnzlz -- mysql -ppassword -e "show databases;": exit status 1 (108.743592ms)

** stderr **
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1807: (dbg) Run:  kubectl --context functional-593000 exec mysql-6cdb49bbb-hnzlz -- mysql -ppassword -e "show databases;"
functional_test.go:1807: (dbg) Non-zero exit: kubectl --context functional-593000 exec mysql-6cdb49bbb-hnzlz -- mysql -ppassword -e "show databases;": exit status 1 (101.259258ms)

** stderr **
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1807: (dbg) Run:  kubectl --context functional-593000 exec mysql-6cdb49bbb-hnzlz -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (25.55s)

TestFunctional/parallel/FileSync (0.15s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1929: Checking for existence of /etc/test/nested/copy/1483/hosts within VM
functional_test.go:1931: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "sudo cat /etc/test/nested/copy/1483/hosts"
functional_test.go:1936: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.15s)

TestFunctional/parallel/CertSync (0.93s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync
=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1972: Checking for existence of /etc/ssl/certs/1483.pem within VM
functional_test.go:1973: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "sudo cat /etc/ssl/certs/1483.pem"
functional_test.go:1972: Checking for existence of /usr/share/ca-certificates/1483.pem within VM
functional_test.go:1973: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "sudo cat /usr/share/ca-certificates/1483.pem"
functional_test.go:1972: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1973: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1999: Checking for existence of /etc/ssl/certs/14832.pem within VM
functional_test.go:2000: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "sudo cat /etc/ssl/certs/14832.pem"
functional_test.go:1999: Checking for existence of /usr/share/ca-certificates/14832.pem within VM
functional_test.go:2000: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "sudo cat /usr/share/ca-certificates/14832.pem"
functional_test.go:1999: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2000: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (0.93s)

TestFunctional/parallel/NodeLabels (0.05s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels
=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:219: (dbg) Run:  kubectl --context functional-593000 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.05s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.2s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2027: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "sudo systemctl is-active crio"
functional_test.go:2027: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-593000 ssh "sudo systemctl is-active crio": exit status 1 (202.889087ms)

-- stdout --
	inactive

-- /stdout --
** stderr **
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.20s)

TestFunctional/parallel/License (0.47s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License
=== CONT  TestFunctional/parallel/License
functional_test.go:2288: (dbg) Run:  out/minikube-darwin-amd64 license
--- PASS: TestFunctional/parallel/License (0.47s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.37s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-593000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-593000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-593000 tunnel --alsologtostderr] ...
helpers_test.go:509: unable to kill pid 2355: os: process already finished
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-593000 tunnel --alsologtostderr] ...
helpers_test.go:491: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.37s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-593000 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (12.13s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-593000 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:345: "nginx-svc" [7906d49e-939d-43fe-a552-481211796a5d] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:345: "nginx-svc" [7906d49e-939d-43fe-a552-481211796a5d] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 12.002879379s
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (12.13s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-593000 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.96.172.16 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.04s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:319: (dbg) Run:  dig +time=5 +tries=3 @10.96.0.10 nginx-svc.default.svc.cluster.local. A
functional_test_tunnel_test.go:327: DNS resolution by dig for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.04s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.03s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:351: (dbg) Run:  dscacheutil -q host -a name nginx-svc.default.svc.cluster.local.
functional_test_tunnel_test.go:359: DNS resolution by dscacheutil for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.03s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:424: tunnel at http://nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-darwin-amd64 -p functional-593000 tunnel --alsologtostderr] ...
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

TestFunctional/parallel/ServiceCmd/DeployApp (7.14s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1439: (dbg) Run:  kubectl --context functional-593000 create deployment hello-node --image=registry.k8s.io/echoserver:1.8
functional_test.go:1445: (dbg) Run:  kubectl --context functional-593000 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1450: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:345: "hello-node-6b9f76b5c7-fz89m" [615b28bb-5de0-4ad2-bfb7-c3c8fdfa60a1] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:345: "hello-node-6b9f76b5c7-fz89m" [615b28bb-5de0-4ad2-bfb7-c3c8fdfa60a1] Running
functional_test.go:1450: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 7.004961565s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (7.14s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.25s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1270: (dbg) Run:  out/minikube-darwin-amd64 profile lis
functional_test.go:1275: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.25s)

TestFunctional/parallel/ProfileCmd/profile_list (0.25s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1310: (dbg) Run:  out/minikube-darwin-amd64 profile list
functional_test.go:1315: Took "174.066316ms" to run "out/minikube-darwin-amd64 profile list"
functional_test.go:1324: (dbg) Run:  out/minikube-darwin-amd64 profile list -l
functional_test.go:1329: Took "78.868186ms" to run "out/minikube-darwin-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.25s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.25s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1361: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json
functional_test.go:1366: Took "175.362294ms" to run "out/minikube-darwin-amd64 profile list -o json"
functional_test.go:1374: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json --light
functional_test.go:1379: Took "78.156624ms" to run "out/minikube-darwin-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.25s)

TestFunctional/parallel/MountCmd/any-port (6.16s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port3356027891/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1725143299901960000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port3356027891/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1725143299901960000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port3356027891/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1725143299901960000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port3356027891/001/test-1725143299901960000
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (118.203511ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Aug 31 22:28 created-by-test
-rw-r--r-- 1 docker docker 24 Aug 31 22:28 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Aug 31 22:28 test-1725143299901960000
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh cat /mount-9p/test-1725143299901960000
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-593000 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:345: "busybox-mount" [be294d22-e0d4-4f6a-9716-f502d6c8508a] Pending
helpers_test.go:345: "busybox-mount" [be294d22-e0d4-4f6a-9716-f502d6c8508a] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:345: "busybox-mount" [be294d22-e0d4-4f6a-9716-f502d6c8508a] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:345: "busybox-mount" [be294d22-e0d4-4f6a-9716-f502d6c8508a] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 4.022415775s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-593000 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port3356027891/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (6.16s)

TestFunctional/parallel/ServiceCmd/List (0.39s)

=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1459: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.39s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.38s)

=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1489: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 service list -o json
functional_test.go:1494: Took "376.089738ms" to run "out/minikube-darwin-amd64 -p functional-593000 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.38s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.29s)

=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1509: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 service --namespace=default --https --url hello-node
functional_test.go:1522: found endpoint: https://192.169.0.4:32252
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.29s)

TestFunctional/parallel/MountCmd/specific-port (1.62s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port1334742505/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (185.496195ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port1334742505/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-593000 ssh "sudo umount -f /mount-9p": exit status 1 (189.511014ms)

                                                
                                                
-- stdout --
	umount: /mount-9p: not mounted.

                                                
                                                
-- /stdout --
** stderr ** 
	ssh: Process exited with status 32
** /stderr **
functional_test_mount_test.go:232: "out/minikube-darwin-amd64 -p functional-593000 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port1334742505/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.62s)
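The cleanup above still passes even though `sudo umount -f /mount-9p` fails: the mount daemon had already unmounted the directory, so the remote umount reports "not mounted." and exits 32, which `minikube ssh` surfaces as its own exit 1. A minimal sketch of that exit-code classification (`umount_outcome` is a hypothetical helper, not part of the minikube test suite; it assumes util-linux umount's convention of exit code 32 for mount failures, including "not mounted"):

```python
# Hypothetical helper, not part of the minikube test suite: classify the
# exit status of `umount -f <dir>` so "already unmounted" is not treated
# as a cleanup failure, mirroring the tolerated failure in the log above.
def umount_outcome(exit_code: int) -> str:
    if exit_code == 0:
        return "unmounted"       # umount actually removed the mount
    if exit_code == 32:
        return "not-mounted"     # util-linux mount failure, benign here
    return "error"               # anything else is a real problem
```

With this split, a teardown path can log the `not-mounted` case and move on rather than failing the test.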

TestFunctional/parallel/ServiceCmd/Format (0.33s)

=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1540: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.33s)

TestFunctional/parallel/ServiceCmd/URL (0.27s)

=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1559: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 service hello-node --url
functional_test.go:1565: found endpoint for hello-node: http://192.169.0.4:32252
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.27s)

TestFunctional/parallel/MountCmd/VerifyCleanup (1.99s)

=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup2592559725/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup2592559725/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup2592559725/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T" /mount1: exit status 1 (181.152412ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-darwin-amd64 mount -p functional-593000 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup2592559725/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:491: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup2592559725/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:491: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-593000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup2592559725/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:491: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (1.99s)
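The pattern visible above — the first `findmnt -T /mount1` exits 1 while the background mount daemon is still starting, and the check is simply re-run until it succeeds — is a poll-until-ready loop. A sketch of that loop (hypothetical helper; `check` stands in for one `findmnt` invocation returning an ok flag and its output):

```python
import time

# Hypothetical poll-until-ready helper, mirroring the retried findmnt
# checks in the log above: run `check` until it reports success, or give
# up after a fixed number of attempts.
def retry_until(check, attempts=5, delay=0.0):
    last = None
    for _ in range(attempts):
        ok, last = check()
        if ok:
            return last          # e.g. the findmnt output once mounted
        time.sleep(delay)        # back off before the next probe
    raise TimeoutError(f"gave up after {attempts} attempts: {last!r}")
```

For instance, a check that fails once and then succeeds returns the successful result on the second attempt.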

TestFunctional/parallel/Version/short (0.1s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short
=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 version --short
--- PASS: TestFunctional/parallel/Version/short (0.10s)

TestFunctional/parallel/Version/components (0.5s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components
=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2270: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.50s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.19s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort
=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image ls --format short --alsologtostderr
functional_test.go:266: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-593000 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.31.0
registry.k8s.io/kube-proxy:v1.31.0
registry.k8s.io/kube-controller-manager:v1.31.0
registry.k8s.io/kube-apiserver:v1.31.0
registry.k8s.io/etcd:3.5.15-0
registry.k8s.io/echoserver:1.8
registry.k8s.io/coredns/coredns:v1.11.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/minikube-local-cache-test:functional-593000
docker.io/kubernetesui/metrics-scraper:<none>
docker.io/kubernetesui/dashboard:<none>
docker.io/kicbase/echo-server:functional-593000
functional_test.go:269: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-593000 image ls --format short --alsologtostderr:
I0831 15:28:43.457428    2822 out.go:345] Setting OutFile to fd 1 ...
I0831 15:28:43.457626    2822 out.go:392] TERM=,COLORTERM=, which probably does not support color
I0831 15:28:43.457632    2822 out.go:358] Setting ErrFile to fd 2...
I0831 15:28:43.457636    2822 out.go:392] TERM=,COLORTERM=, which probably does not support color
I0831 15:28:43.457825    2822 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
I0831 15:28:43.458418    2822 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
I0831 15:28:43.458511    2822 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
I0831 15:28:43.458867    2822 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0831 15:28:43.458912    2822 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0831 15:28:43.467429    2822 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50969
I0831 15:28:43.467882    2822 main.go:141] libmachine: () Calling .GetVersion
I0831 15:28:43.468338    2822 main.go:141] libmachine: Using API Version  1
I0831 15:28:43.468348    2822 main.go:141] libmachine: () Calling .SetConfigRaw
I0831 15:28:43.468563    2822 main.go:141] libmachine: () Calling .GetMachineName
I0831 15:28:43.468706    2822 main.go:141] libmachine: (functional-593000) Calling .GetState
I0831 15:28:43.468804    2822 main.go:141] libmachine: (functional-593000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0831 15:28:43.468886    2822 main.go:141] libmachine: (functional-593000) DBG | hyperkit pid from json: 2119
I0831 15:28:43.470171    2822 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0831 15:28:43.470193    2822 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0831 15:28:43.478705    2822 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50971
I0831 15:28:43.479070    2822 main.go:141] libmachine: () Calling .GetVersion
I0831 15:28:43.479395    2822 main.go:141] libmachine: Using API Version  1
I0831 15:28:43.479407    2822 main.go:141] libmachine: () Calling .SetConfigRaw
I0831 15:28:43.479661    2822 main.go:141] libmachine: () Calling .GetMachineName
I0831 15:28:43.479761    2822 main.go:141] libmachine: (functional-593000) Calling .DriverName
I0831 15:28:43.479913    2822 ssh_runner.go:195] Run: systemctl --version
I0831 15:28:43.479930    2822 main.go:141] libmachine: (functional-593000) Calling .GetSSHHostname
I0831 15:28:43.480009    2822 main.go:141] libmachine: (functional-593000) Calling .GetSSHPort
I0831 15:28:43.480091    2822 main.go:141] libmachine: (functional-593000) Calling .GetSSHKeyPath
I0831 15:28:43.480168    2822 main.go:141] libmachine: (functional-593000) Calling .GetSSHUsername
I0831 15:28:43.480259    2822 sshutil.go:53] new ssh client: &{IP:192.169.0.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/functional-593000/id_rsa Username:docker}
I0831 15:28:43.520067    2822 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0831 15:28:43.567653    2822 main.go:141] libmachine: Making call to close driver server
I0831 15:28:43.567663    2822 main.go:141] libmachine: (functional-593000) Calling .Close
I0831 15:28:43.567817    2822 main.go:141] libmachine: Successfully made call to close driver server
I0831 15:28:43.567829    2822 main.go:141] libmachine: Making call to close connection to plugin binary
I0831 15:28:43.567836    2822 main.go:141] libmachine: Making call to close driver server
I0831 15:28:43.567841    2822 main.go:141] libmachine: (functional-593000) DBG | Closing plugin on server side
I0831 15:28:43.567843    2822 main.go:141] libmachine: (functional-593000) Calling .Close
I0831 15:28:43.567998    2822 main.go:141] libmachine: (functional-593000) DBG | Closing plugin on server side
I0831 15:28:43.568026    2822 main.go:141] libmachine: Successfully made call to close driver server
I0831 15:28:43.568040    2822 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.19s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable
=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image ls --format table --alsologtostderr
functional_test.go:266: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-593000 image ls --format table --alsologtostderr:
|---------------------------------------------|-------------------|---------------|--------|
|                    Image                    |        Tag        |   Image ID    |  Size  |
|---------------------------------------------|-------------------|---------------|--------|
| registry.k8s.io/kube-controller-manager     | v1.31.0           | 045733566833c | 88.4MB |
| docker.io/library/nginx                     | alpine            | 0f0eda053dc5c | 43.3MB |
| docker.io/kubernetesui/dashboard            | <none>            | 07655ddf2eebe | 246MB  |
| registry.k8s.io/pause                       | 3.3               | 0184c1613d929 | 683kB  |
| registry.k8s.io/pause                       | 3.1               | da86e6ba6ca19 | 742kB  |
| docker.io/library/minikube-local-cache-test | functional-593000 | 1c0c1c492baa2 | 30B    |
| registry.k8s.io/kube-apiserver              | v1.31.0           | 604f5db92eaa8 | 94.2MB |
| registry.k8s.io/etcd                        | 3.5.15-0          | 2e96e5913fc06 | 148MB  |
| docker.io/kicbase/echo-server               | functional-593000 | 9056ab77afb8e | 4.94MB |
| registry.k8s.io/echoserver                  | 1.8               | 82e4c8a736a4f | 95.4MB |
| registry.k8s.io/pause                       | latest            | 350b164e7ae1d | 240kB  |
| docker.io/library/nginx                     | latest            | 5ef79149e0ec8 | 188MB  |
| registry.k8s.io/kube-scheduler              | v1.31.0           | 1766f54c897f0 | 67.4MB |
| registry.k8s.io/kube-proxy                  | v1.31.0           | ad83b2ca7b09e | 91.5MB |
| registry.k8s.io/pause                       | 3.10              | 873ed75102791 | 736kB  |
| registry.k8s.io/coredns/coredns             | v1.11.1           | cbb01a7bd410d | 59.8MB |
| docker.io/kubernetesui/metrics-scraper      | <none>            | 115053965e86b | 43.8MB |
| gcr.io/k8s-minikube/storage-provisioner     | v5                | 6e38f40d628db | 31.5MB |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc      | 56cc512116c8f | 4.4MB  |
| localhost/my-image                          | functional-593000 | 755402448f640 | 1.24MB |
|---------------------------------------------|-------------------|---------------|--------|
functional_test.go:269: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-593000 image ls --format table --alsologtostderr:
I0831 15:28:46.669056    2847 out.go:345] Setting OutFile to fd 1 ...
I0831 15:28:46.669337    2847 out.go:392] TERM=,COLORTERM=, which probably does not support color
I0831 15:28:46.669342    2847 out.go:358] Setting ErrFile to fd 2...
I0831 15:28:46.669346    2847 out.go:392] TERM=,COLORTERM=, which probably does not support color
I0831 15:28:46.669517    2847 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
I0831 15:28:46.670833    2847 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
I0831 15:28:46.670932    2847 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
I0831 15:28:46.671274    2847 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0831 15:28:46.671317    2847 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0831 15:28:46.679662    2847 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51001
I0831 15:28:46.680115    2847 main.go:141] libmachine: () Calling .GetVersion
I0831 15:28:46.680524    2847 main.go:141] libmachine: Using API Version  1
I0831 15:28:46.680533    2847 main.go:141] libmachine: () Calling .SetConfigRaw
I0831 15:28:46.680793    2847 main.go:141] libmachine: () Calling .GetMachineName
I0831 15:28:46.680917    2847 main.go:141] libmachine: (functional-593000) Calling .GetState
I0831 15:28:46.681010    2847 main.go:141] libmachine: (functional-593000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0831 15:28:46.681110    2847 main.go:141] libmachine: (functional-593000) DBG | hyperkit pid from json: 2119
I0831 15:28:46.682340    2847 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0831 15:28:46.682368    2847 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0831 15:28:46.690692    2847 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51003
I0831 15:28:46.691066    2847 main.go:141] libmachine: () Calling .GetVersion
I0831 15:28:46.691457    2847 main.go:141] libmachine: Using API Version  1
I0831 15:28:46.691479    2847 main.go:141] libmachine: () Calling .SetConfigRaw
I0831 15:28:46.691687    2847 main.go:141] libmachine: () Calling .GetMachineName
I0831 15:28:46.691798    2847 main.go:141] libmachine: (functional-593000) Calling .DriverName
I0831 15:28:46.691973    2847 ssh_runner.go:195] Run: systemctl --version
I0831 15:28:46.691989    2847 main.go:141] libmachine: (functional-593000) Calling .GetSSHHostname
I0831 15:28:46.692079    2847 main.go:141] libmachine: (functional-593000) Calling .GetSSHPort
I0831 15:28:46.692162    2847 main.go:141] libmachine: (functional-593000) Calling .GetSSHKeyPath
I0831 15:28:46.692247    2847 main.go:141] libmachine: (functional-593000) Calling .GetSSHUsername
I0831 15:28:46.692326    2847 sshutil.go:53] new ssh client: &{IP:192.169.0.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/functional-593000/id_rsa Username:docker}
I0831 15:28:46.726508    2847 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0831 15:28:46.753452    2847 main.go:141] libmachine: Making call to close driver server
I0831 15:28:46.753462    2847 main.go:141] libmachine: (functional-593000) Calling .Close
I0831 15:28:46.753627    2847 main.go:141] libmachine: Successfully made call to close driver server
I0831 15:28:46.753636    2847 main.go:141] libmachine: Making call to close connection to plugin binary
I0831 15:28:46.753641    2847 main.go:141] libmachine: Making call to close driver server
I0831 15:28:46.753648    2847 main.go:141] libmachine: (functional-593000) Calling .Close
I0831 15:28:46.753658    2847 main.go:141] libmachine: (functional-593000) DBG | Closing plugin on server side
I0831 15:28:46.753809    2847 main.go:141] libmachine: (functional-593000) DBG | Closing plugin on server side
I0831 15:28:46.753841    2847 main.go:141] libmachine: Successfully made call to close driver server
I0831 15:28:46.753849    2847 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.16s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.15s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson
=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image ls --format json --alsologtostderr
functional_test.go:266: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-593000 image ls --format json --alsologtostderr:
[{"id":"604f5db92eaa823d11c141d8825f1460206f6bf29babca2a909a698dc22055d3","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.31.0"],"size":"94200000"},{"id":"9056ab77afb8e18e04303f11000a9d31b3f16b74c59475b899ae1b342d328d30","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-593000"],"size":"4940000"},{"id":"0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"683000"},{"id":"da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"742000"},{"id":"0f0eda053dc5c4c8240f11542cb4d200db6a11d476a4189b1eb0a3afa5684a9a","repoDigests":[],"repoTags":["docker.io/library/nginx:alpine"],"size":"43300000"},{"id":"1c0c1c492baa2b2e8c81f685295d3e5f2c61345a10696b9b5a8f2a53d0f6b84d","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-593000"],"size":"30"},{"id":"5ef79149e0ec84a7a9f9284c3f91aa3c20608f8391f5445eabe92ef07dbda03c","repoDigests":[],"repoTags":["docker.io/library/nginx:latest"],"size":"188000000"},{"id":"1766f54c897f0e57040741e6741462f2e3a7d754705f446c9f729c7e1230fb94","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.31.0"],"size":"67400000"},{"id":"6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"31500000"},{"id":"755402448f64007245df3203914d59b2685562990c126034de5197c54b0d589e","repoDigests":[],"repoTags":["localhost/my-image:functional-593000"],"size":"1240000"},{"id":"2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.5.15-0"],"size":"148000000"},{"id":"873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.10"],"size":"736000"},{"id":"cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.11.1"],"size":"59800000"},{"id":"115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7","repoDigests":[],"repoTags":["docker.io/kubernetesui/metrics-scraper:\u003cnone\u003e"],"size":"43800000"},{"id":"56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"4400000"},{"id":"82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":[],"repoTags":["registry.k8s.io/echoserver:1.8"],"size":"95400000"},{"id":"045733566833c40b15806c9b87d27f08e455e069833752e0e6ad7a76d37cb2b1","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.31.0"],"size":"88400000"},{"id":"07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558","repoDigests":[],"repoTags":["docker.io/kubernetesui/dashboard:\u003cnone\u003e"],"size":"246000000"},{"id":"350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"240000"},{"id":"ad83b2ca7b09e6162f96f933eecded731cbebf049c78f941fd0ce560a86b6494","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.31.0"],"size":"91500000"}]
functional_test.go:269: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-593000 image ls --format json --alsologtostderr:
I0831 15:28:46.517066    2843 out.go:345] Setting OutFile to fd 1 ...
I0831 15:28:46.517353    2843 out.go:392] TERM=,COLORTERM=, which probably does not support color
I0831 15:28:46.517358    2843 out.go:358] Setting ErrFile to fd 2...
I0831 15:28:46.517362    2843 out.go:392] TERM=,COLORTERM=, which probably does not support color
I0831 15:28:46.517550    2843 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
I0831 15:28:46.518123    2843 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
I0831 15:28:46.518216    2843 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
I0831 15:28:46.518553    2843 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0831 15:28:46.518593    2843 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0831 15:28:46.526817    2843 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50996
I0831 15:28:46.527221    2843 main.go:141] libmachine: () Calling .GetVersion
I0831 15:28:46.527631    2843 main.go:141] libmachine: Using API Version  1
I0831 15:28:46.527662    2843 main.go:141] libmachine: () Calling .SetConfigRaw
I0831 15:28:46.527873    2843 main.go:141] libmachine: () Calling .GetMachineName
I0831 15:28:46.527983    2843 main.go:141] libmachine: (functional-593000) Calling .GetState
I0831 15:28:46.528073    2843 main.go:141] libmachine: (functional-593000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0831 15:28:46.528136    2843 main.go:141] libmachine: (functional-593000) DBG | hyperkit pid from json: 2119
I0831 15:28:46.529373    2843 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0831 15:28:46.529393    2843 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0831 15:28:46.537658    2843 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50998
I0831 15:28:46.538017    2843 main.go:141] libmachine: () Calling .GetVersion
I0831 15:28:46.538398    2843 main.go:141] libmachine: Using API Version  1
I0831 15:28:46.538415    2843 main.go:141] libmachine: () Calling .SetConfigRaw
I0831 15:28:46.538621    2843 main.go:141] libmachine: () Calling .GetMachineName
I0831 15:28:46.538724    2843 main.go:141] libmachine: (functional-593000) Calling .DriverName
I0831 15:28:46.538876    2843 ssh_runner.go:195] Run: systemctl --version
I0831 15:28:46.538891    2843 main.go:141] libmachine: (functional-593000) Calling .GetSSHHostname
I0831 15:28:46.538961    2843 main.go:141] libmachine: (functional-593000) Calling .GetSSHPort
I0831 15:28:46.539040    2843 main.go:141] libmachine: (functional-593000) Calling .GetSSHKeyPath
I0831 15:28:46.539124    2843 main.go:141] libmachine: (functional-593000) Calling .GetSSHUsername
I0831 15:28:46.539216    2843 sshutil.go:53] new ssh client: &{IP:192.169.0.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/functional-593000/id_rsa Username:docker}
I0831 15:28:46.569254    2843 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0831 15:28:46.589074    2843 main.go:141] libmachine: Making call to close driver server
I0831 15:28:46.589089    2843 main.go:141] libmachine: (functional-593000) Calling .Close
I0831 15:28:46.589238    2843 main.go:141] libmachine: Successfully made call to close driver server
I0831 15:28:46.589247    2843 main.go:141] libmachine: Making call to close connection to plugin binary
I0831 15:28:46.589252    2843 main.go:141] libmachine: Making call to close driver server
I0831 15:28:46.589257    2843 main.go:141] libmachine: (functional-593000) Calling .Close
I0831 15:28:46.589408    2843 main.go:141] libmachine: Successfully made call to close driver server
I0831 15:28:46.589415    2843 main.go:141] libmachine: Making call to close connection to plugin binary
I0831 15:28:46.589435    2843 main.go:141] libmachine: (functional-593000) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.15s)
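The `--format json` stdout above is a single JSON array; each element carries `id`, `repoDigests`, `repoTags`, and `size` (bytes, encoded as a string). A sketch of consuming it (the one-element `sample` below reuses the pause:3.10 entry from the log; `size_by_tag` is an illustrative name):

```python
import json

# Consume the `image ls --format json` shape shown above: a JSON array of
# objects with id, repoDigests, repoTags, and size (a string of bytes).
sample = ('[{"id":"873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136",'
          '"repoDigests":[],"repoTags":["registry.k8s.io/pause:3.10"],"size":"736000"}]')

# Map each repo:tag to its size in bytes, converting the string field.
size_by_tag = {tag: int(img["size"])
               for img in json.loads(sample)
               for tag in img["repoTags"]}
```

Note the `\u003cnone\u003e` escapes in the raw stdout: Go's JSON encoder escapes `<` and `>`, so `docker.io/kubernetesui/dashboard:<none>` round-trips correctly through `json.loads`.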

TestFunctional/parallel/ImageCommands/ImageListYaml (0.2s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml
=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image ls --format yaml --alsologtostderr
functional_test.go:266: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-593000 image ls --format yaml --alsologtostderr:
- id: cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.11.1
size: "59800000"
- id: 07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558
repoDigests: []
repoTags:
- docker.io/kubernetesui/dashboard:<none>
size: "246000000"
- id: 115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7
repoDigests: []
repoTags:
- docker.io/kubernetesui/metrics-scraper:<none>
size: "43800000"
- id: 6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "31500000"
- id: 0f0eda053dc5c4c8240f11542cb4d200db6a11d476a4189b1eb0a3afa5684a9a
repoDigests: []
repoTags:
- docker.io/library/nginx:alpine
size: "43300000"
- id: 1766f54c897f0e57040741e6741462f2e3a7d754705f446c9f729c7e1230fb94
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.31.0
size: "67400000"
- id: 873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.10
size: "736000"
- id: 1c0c1c492baa2b2e8c81f685295d3e5f2c61345a10696b9b5a8f2a53d0f6b84d
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-593000
size: "30"
- id: 2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.5.15-0
size: "148000000"
- id: 350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "240000"
- id: ad83b2ca7b09e6162f96f933eecded731cbebf049c78f941fd0ce560a86b6494
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.31.0
size: "91500000"
- id: 0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "683000"
- id: 56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "4400000"
- id: da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "742000"
- id: 82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests: []
repoTags:
- registry.k8s.io/echoserver:1.8
size: "95400000"
- id: 5ef79149e0ec84a7a9f9284c3f91aa3c20608f8391f5445eabe92ef07dbda03c
repoDigests: []
repoTags:
- docker.io/library/nginx:latest
size: "188000000"
- id: 604f5db92eaa823d11c141d8825f1460206f6bf29babca2a909a698dc22055d3
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.31.0
size: "94200000"
- id: 045733566833c40b15806c9b87d27f08e455e069833752e0e6ad7a76d37cb2b1
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.31.0
size: "88400000"
- id: 9056ab77afb8e18e04303f11000a9d31b3f16b74c59475b899ae1b342d328d30
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-593000
size: "4940000"

functional_test.go:269: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-593000 image ls --format yaml --alsologtostderr:
I0831 15:28:43.649872    2826 out.go:345] Setting OutFile to fd 1 ...
I0831 15:28:43.650150    2826 out.go:392] TERM=,COLORTERM=, which probably does not support color
I0831 15:28:43.650156    2826 out.go:358] Setting ErrFile to fd 2...
I0831 15:28:43.650160    2826 out.go:392] TERM=,COLORTERM=, which probably does not support color
I0831 15:28:43.650349    2826 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
I0831 15:28:43.650949    2826 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
I0831 15:28:43.651045    2826 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
I0831 15:28:43.651400    2826 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0831 15:28:43.651443    2826 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0831 15:28:43.660098    2826 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50974
I0831 15:28:43.660539    2826 main.go:141] libmachine: () Calling .GetVersion
I0831 15:28:43.661002    2826 main.go:141] libmachine: Using API Version  1
I0831 15:28:43.661012    2826 main.go:141] libmachine: () Calling .SetConfigRaw
I0831 15:28:43.661276    2826 main.go:141] libmachine: () Calling .GetMachineName
I0831 15:28:43.661393    2826 main.go:141] libmachine: (functional-593000) Calling .GetState
I0831 15:28:43.661477    2826 main.go:141] libmachine: (functional-593000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0831 15:28:43.661544    2826 main.go:141] libmachine: (functional-593000) DBG | hyperkit pid from json: 2119
I0831 15:28:43.662903    2826 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0831 15:28:43.662946    2826 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0831 15:28:43.671545    2826 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50976
I0831 15:28:43.671923    2826 main.go:141] libmachine: () Calling .GetVersion
I0831 15:28:43.672307    2826 main.go:141] libmachine: Using API Version  1
I0831 15:28:43.672321    2826 main.go:141] libmachine: () Calling .SetConfigRaw
I0831 15:28:43.672562    2826 main.go:141] libmachine: () Calling .GetMachineName
I0831 15:28:43.672712    2826 main.go:141] libmachine: (functional-593000) Calling .DriverName
I0831 15:28:43.672919    2826 ssh_runner.go:195] Run: systemctl --version
I0831 15:28:43.672936    2826 main.go:141] libmachine: (functional-593000) Calling .GetSSHHostname
I0831 15:28:43.673029    2826 main.go:141] libmachine: (functional-593000) Calling .GetSSHPort
I0831 15:28:43.673131    2826 main.go:141] libmachine: (functional-593000) Calling .GetSSHKeyPath
I0831 15:28:43.673230    2826 main.go:141] libmachine: (functional-593000) Calling .GetSSHUsername
I0831 15:28:43.673319    2826 sshutil.go:53] new ssh client: &{IP:192.169.0.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/functional-593000/id_rsa Username:docker}
I0831 15:28:43.712880    2826 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0831 15:28:43.767007    2826 main.go:141] libmachine: Making call to close driver server
I0831 15:28:43.767017    2826 main.go:141] libmachine: (functional-593000) Calling .Close
I0831 15:28:43.767171    2826 main.go:141] libmachine: Successfully made call to close driver server
I0831 15:28:43.767180    2826 main.go:141] libmachine: Making call to close connection to plugin binary
I0831 15:28:43.767185    2826 main.go:141] libmachine: Making call to close driver server
I0831 15:28:43.767192    2826 main.go:141] libmachine: (functional-593000) Calling .Close
I0831 15:28:43.767195    2826 main.go:141] libmachine: (functional-593000) DBG | Closing plugin on server side
I0831 15:28:43.767317    2826 main.go:141] libmachine: Successfully made call to close driver server
I0831 15:28:43.767324    2826 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.20s)

TestFunctional/parallel/ImageCommands/ImageBuild (2.67s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:308: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 ssh pgrep buildkitd
functional_test.go:308: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-593000 ssh pgrep buildkitd: exit status 1 (137.376522ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:315: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image build -t localhost/my-image:functional-593000 testdata/build --alsologtostderr
functional_test.go:315: (dbg) Done: out/minikube-darwin-amd64 -p functional-593000 image build -t localhost/my-image:functional-593000 testdata/build --alsologtostderr: (2.377130231s)
functional_test.go:323: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-593000 image build -t localhost/my-image:functional-593000 testdata/build --alsologtostderr:
I0831 15:28:43.987848    2835 out.go:345] Setting OutFile to fd 1 ...
I0831 15:28:43.988217    2835 out.go:392] TERM=,COLORTERM=, which probably does not support color
I0831 15:28:43.988223    2835 out.go:358] Setting ErrFile to fd 2...
I0831 15:28:43.988227    2835 out.go:392] TERM=,COLORTERM=, which probably does not support color
I0831 15:28:43.988411    2835 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
I0831 15:28:43.989003    2835 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
I0831 15:28:43.990154    2835 config.go:182] Loaded profile config "functional-593000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
I0831 15:28:43.990504    2835 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0831 15:28:43.990540    2835 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0831 15:28:43.998997    2835 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50986
I0831 15:28:43.999441    2835 main.go:141] libmachine: () Calling .GetVersion
I0831 15:28:43.999855    2835 main.go:141] libmachine: Using API Version  1
I0831 15:28:43.999867    2835 main.go:141] libmachine: () Calling .SetConfigRaw
I0831 15:28:44.000174    2835 main.go:141] libmachine: () Calling .GetMachineName
I0831 15:28:44.000310    2835 main.go:141] libmachine: (functional-593000) Calling .GetState
I0831 15:28:44.000402    2835 main.go:141] libmachine: (functional-593000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0831 15:28:44.000472    2835 main.go:141] libmachine: (functional-593000) DBG | hyperkit pid from json: 2119
I0831 15:28:44.001754    2835 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0831 15:28:44.001803    2835 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0831 15:28:44.010570    2835 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50988
I0831 15:28:44.010988    2835 main.go:141] libmachine: () Calling .GetVersion
I0831 15:28:44.011321    2835 main.go:141] libmachine: Using API Version  1
I0831 15:28:44.011331    2835 main.go:141] libmachine: () Calling .SetConfigRaw
I0831 15:28:44.011584    2835 main.go:141] libmachine: () Calling .GetMachineName
I0831 15:28:44.011710    2835 main.go:141] libmachine: (functional-593000) Calling .DriverName
I0831 15:28:44.011876    2835 ssh_runner.go:195] Run: systemctl --version
I0831 15:28:44.011894    2835 main.go:141] libmachine: (functional-593000) Calling .GetSSHHostname
I0831 15:28:44.011990    2835 main.go:141] libmachine: (functional-593000) Calling .GetSSHPort
I0831 15:28:44.012083    2835 main.go:141] libmachine: (functional-593000) Calling .GetSSHKeyPath
I0831 15:28:44.012166    2835 main.go:141] libmachine: (functional-593000) Calling .GetSSHUsername
I0831 15:28:44.012254    2835 sshutil.go:53] new ssh client: &{IP:192.169.0.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/functional-593000/id_rsa Username:docker}
I0831 15:28:44.052924    2835 build_images.go:161] Building image from path: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/build.882017342.tar
I0831 15:28:44.053055    2835 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I0831 15:28:44.064974    2835 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.882017342.tar
I0831 15:28:44.069220    2835 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.882017342.tar: stat -c "%s %y" /var/lib/minikube/build/build.882017342.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.882017342.tar': No such file or directory
I0831 15:28:44.069254    2835 ssh_runner.go:362] scp /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/build.882017342.tar --> /var/lib/minikube/build/build.882017342.tar (3072 bytes)
I0831 15:28:44.108389    2835 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.882017342
I0831 15:28:44.118705    2835 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.882017342 -xf /var/lib/minikube/build/build.882017342.tar
I0831 15:28:44.129176    2835 docker.go:360] Building image: /var/lib/minikube/build/build.882017342
I0831 15:28:44.129241    2835 ssh_runner.go:195] Run: docker build -t localhost/my-image:functional-593000 /var/lib/minikube/build/build.882017342
#0 building with "default" instance using docker driver

#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.0s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b done
#5 sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee 527B / 527B done
#5 sha256:beae173ccac6ad749f76713cf4440fe3d21d1043fe616dfbe30775815d1d0f6a 1.46kB / 1.46kB done
#5 sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 0B / 772.79kB 0.1s
#5 sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 770B / 770B done
#5 sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 772.79kB / 772.79kB 0.3s done
#5 extracting sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa
#5 extracting sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 0.1s done
#5 DONE 0.5s

#6 [2/3] RUN true
#6 DONE 0.3s

#7 [3/3] ADD content.txt /
#7 DONE 0.0s

#8 exporting to image
#8 exporting layers 0.0s done
#8 writing image sha256:755402448f64007245df3203914d59b2685562990c126034de5197c54b0d589e done
#8 naming to localhost/my-image:functional-593000 done
#8 DONE 0.0s
I0831 15:28:46.260252    2835 ssh_runner.go:235] Completed: docker build -t localhost/my-image:functional-593000 /var/lib/minikube/build/build.882017342: (2.13096809s)
I0831 15:28:46.260317    2835 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.882017342
I0831 15:28:46.269657    2835 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.882017342.tar
I0831 15:28:46.279031    2835 build_images.go:217] Built localhost/my-image:functional-593000 from /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/build.882017342.tar
I0831 15:28:46.279064    2835 build_images.go:133] succeeded building to: functional-593000
I0831 15:28:46.279069    2835 build_images.go:134] failed building to: 
I0831 15:28:46.279085    2835 main.go:141] libmachine: Making call to close driver server
I0831 15:28:46.279099    2835 main.go:141] libmachine: (functional-593000) Calling .Close
I0831 15:28:46.279247    2835 main.go:141] libmachine: Successfully made call to close driver server
I0831 15:28:46.279257    2835 main.go:141] libmachine: Making call to close connection to plugin binary
I0831 15:28:46.279264    2835 main.go:141] libmachine: Making call to close driver server
I0831 15:28:46.279274    2835 main.go:141] libmachine: (functional-593000) Calling .Close
I0831 15:28:46.279395    2835 main.go:141] libmachine: Successfully made call to close driver server
I0831 15:28:46.279406    2835 main.go:141] libmachine: Making call to close connection to plugin binary
I0831 15:28:46.279415    2835 main.go:141] libmachine: (functional-593000) DBG | Closing plugin on server side
functional_test.go:451: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (2.67s)

TestFunctional/parallel/ImageCommands/Setup (1.7s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:342: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:342: (dbg) Done: docker pull kicbase/echo-server:1.0: (1.677108179s)
functional_test.go:347: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-593000
--- PASS: TestFunctional/parallel/ImageCommands/Setup (1.70s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (0.9s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:355: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image load --daemon kicbase/echo-server:functional-593000 --alsologtostderr
functional_test.go:451: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (0.90s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (0.6s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:365: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image load --daemon kicbase/echo-server:functional-593000 --alsologtostderr
functional_test.go:451: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (0.60s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.4s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:235: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:240: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-593000
functional_test.go:245: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image load --daemon kicbase/echo-server:functional-593000 --alsologtostderr
functional_test.go:451: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.40s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.29s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:380: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image save kicbase/echo-server:functional-593000 /Users/jenkins/workspace/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.29s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.33s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:392: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image rm kicbase/echo-server:functional-593000 --alsologtostderr
functional_test.go:451: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.33s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.6s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:409: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image load /Users/jenkins/workspace/echo-server-save.tar --alsologtostderr
functional_test.go:451: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.60s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.41s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:419: (dbg) Run:  docker rmi kicbase/echo-server:functional-593000
functional_test.go:424: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 image save --daemon kicbase/echo-server:functional-593000 --alsologtostderr
functional_test.go:432: (dbg) Run:  docker image inspect kicbase/echo-server:functional-593000
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.41s)

TestFunctional/parallel/DockerEnv/bash (0.59s)

=== RUN   TestFunctional/parallel/DockerEnv/bash
functional_test.go:499: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-593000 docker-env) && out/minikube-darwin-amd64 status -p functional-593000"
functional_test.go:522: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-593000 docker-env) && docker images"
--- PASS: TestFunctional/parallel/DockerEnv/bash (0.59s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.24s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2119: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.24s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.21s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2119: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.21s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2119: (dbg) Run:  out/minikube-darwin-amd64 -p functional-593000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

TestFunctional/delete_echo-server_images (0.04s)

=== RUN   TestFunctional/delete_echo-server_images
functional_test.go:190: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:190: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-593000
--- PASS: TestFunctional/delete_echo-server_images (0.04s)

TestFunctional/delete_my-image_image (0.02s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:198: (dbg) Run:  docker rmi -f localhost/my-image:functional-593000
--- PASS: TestFunctional/delete_my-image_image (0.02s)

TestFunctional/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:206: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-593000
--- PASS: TestFunctional/delete_minikube_cached_images (0.02s)

TestMultiControlPlane/serial/StartCluster (190.45s)

=== RUN   TestMultiControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p ha-949000 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=hyperkit 
E0831 15:29:15.438258    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:29:43.156085    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
ha_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p ha-949000 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=hyperkit : (3m10.064914489s)
ha_test.go:107: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
--- PASS: TestMultiControlPlane/serial/StartCluster (190.45s)

TestMultiControlPlane/serial/DeployApp (6.36s)

=== RUN   TestMultiControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- rollout status deployment/busybox
ha_test.go:133: (dbg) Done: out/minikube-darwin-amd64 kubectl -p ha-949000 -- rollout status deployment/busybox: (4.049590138s)
ha_test.go:140: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:163: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-5kkbw -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-6r9s5 -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-vjf9x -- nslookup kubernetes.io
ha_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-5kkbw -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-6r9s5 -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-vjf9x -- nslookup kubernetes.default
ha_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-5kkbw -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-6r9s5 -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-vjf9x -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiControlPlane/serial/DeployApp (6.36s)

TestMultiControlPlane/serial/PingHostFromPods (1.32s)

=== RUN   TestMultiControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-5kkbw -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-5kkbw -- sh -c "ping -c 1 192.169.0.1"
ha_test.go:207: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-6r9s5 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-6r9s5 -- sh -c "ping -c 1 192.169.0.1"
ha_test.go:207: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-vjf9x -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-949000 -- exec busybox-7dff88458-vjf9x -- sh -c "ping -c 1 192.169.0.1"
--- PASS: TestMultiControlPlane/serial/PingHostFromPods (1.32s)

TestMultiControlPlane/serial/NodeLabels (0.05s)

=== RUN   TestMultiControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-949000 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiControlPlane/serial/NodeLabels (0.05s)

TestMultiControlPlane/serial/HAppyAfterClusterStart (0.35s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiControlPlane/serial/HAppyAfterClusterStart (0.35s)

TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.27s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:390: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.27s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (0.34s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart
ha_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (0.34s)

TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.26s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete
ha_test.go:390: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.26s)

TestMultiControlPlane/serial/StopCluster (18.99s)

=== RUN   TestMultiControlPlane/serial/StopCluster
ha_test.go:531: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 stop -v=7 --alsologtostderr
E0831 15:42:52.723870    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
ha_test.go:531: (dbg) Done: out/minikube-darwin-amd64 -p ha-949000 stop -v=7 --alsologtostderr: (18.891947878s)
ha_test.go:537: (dbg) Run:  out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr
ha_test.go:537: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-949000 status -v=7 --alsologtostderr: exit status 7 (94.146332ms)

-- stdout --
	ha-949000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-949000-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-949000-m04
	type: Worker
	host: Stopped
	kubelet: Stopped

-- /stdout --
** stderr ** 
	I0831 15:42:55.806017    3998 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:42:55.806198    3998 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:42:55.806204    3998 out.go:358] Setting ErrFile to fd 2...
	I0831 15:42:55.806208    3998 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:42:55.806380    3998 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:42:55.806553    3998 out.go:352] Setting JSON to false
	I0831 15:42:55.806578    3998 mustload.go:65] Loading cluster: ha-949000
	I0831 15:42:55.806615    3998 notify.go:220] Checking for updates...
	I0831 15:42:55.806891    3998 config.go:182] Loaded profile config "ha-949000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:42:55.806907    3998 status.go:255] checking status of ha-949000 ...
	I0831 15:42:55.807239    3998 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:55.807289    3998 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:55.816234    3998 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52039
	I0831 15:42:55.816735    3998 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:55.817200    3998 main.go:141] libmachine: Using API Version  1
	I0831 15:42:55.817215    3998 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:55.817519    3998 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:55.817695    3998 main.go:141] libmachine: (ha-949000) Calling .GetState
	I0831 15:42:55.817819    3998 main.go:141] libmachine: (ha-949000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:55.817873    3998 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid from json: 3756
	I0831 15:42:55.818815    3998 main.go:141] libmachine: (ha-949000) DBG | hyperkit pid 3756 missing from process table
	I0831 15:42:55.818867    3998 status.go:330] ha-949000 host status = "Stopped" (err=<nil>)
	I0831 15:42:55.818879    3998 status.go:343] host is not running, skipping remaining checks
	I0831 15:42:55.818885    3998 status.go:257] ha-949000 status: &{Name:ha-949000 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:42:55.818926    3998 status.go:255] checking status of ha-949000-m02 ...
	I0831 15:42:55.819204    3998 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:55.819250    3998 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:55.827943    3998 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52042
	I0831 15:42:55.828399    3998 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:55.828798    3998 main.go:141] libmachine: Using API Version  1
	I0831 15:42:55.828817    3998 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:55.829176    3998 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:55.829316    3998 main.go:141] libmachine: (ha-949000-m02) Calling .GetState
	I0831 15:42:55.829426    3998 main.go:141] libmachine: (ha-949000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:55.829477    3998 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid from json: 3763
	I0831 15:42:55.830468    3998 main.go:141] libmachine: (ha-949000-m02) DBG | hyperkit pid 3763 missing from process table
	I0831 15:42:55.830536    3998 status.go:330] ha-949000-m02 host status = "Stopped" (err=<nil>)
	I0831 15:42:55.830545    3998 status.go:343] host is not running, skipping remaining checks
	I0831 15:42:55.830552    3998 status.go:257] ha-949000-m02 status: &{Name:ha-949000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:42:55.830564    3998 status.go:255] checking status of ha-949000-m04 ...
	I0831 15:42:55.830838    3998 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:42:55.830860    3998 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:42:55.839555    3998 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52044
	I0831 15:42:55.840009    3998 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:42:55.840465    3998 main.go:141] libmachine: Using API Version  1
	I0831 15:42:55.840478    3998 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:42:55.840761    3998 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:42:55.840882    3998 main.go:141] libmachine: (ha-949000-m04) Calling .GetState
	I0831 15:42:55.841025    3998 main.go:141] libmachine: (ha-949000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:42:55.841091    3998 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid from json: 3806
	I0831 15:42:55.842204    3998 main.go:141] libmachine: (ha-949000-m04) DBG | hyperkit pid 3806 missing from process table
	I0831 15:42:55.842230    3998 status.go:330] ha-949000-m04 host status = "Stopped" (err=<nil>)
	I0831 15:42:55.842240    3998 status.go:343] host is not running, skipping remaining checks
	I0831 15:42:55.842247    3998 status.go:257] ha-949000-m04 status: &{Name:ha-949000-m04 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiControlPlane/serial/StopCluster (18.99s)

TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.25s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterClusterRestart
ha_test.go:390: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.25s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (0.33s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd
ha_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (0.33s)

TestImageBuild/serial/Setup (37.94s)

=== RUN   TestImageBuild/serial/Setup
image_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -p image-012000 --driver=hyperkit 
image_test.go:69: (dbg) Done: out/minikube-darwin-amd64 start -p image-012000 --driver=hyperkit : (37.935088068s)
--- PASS: TestImageBuild/serial/Setup (37.94s)

TestImageBuild/serial/NormalBuild (1.59s)

=== RUN   TestImageBuild/serial/NormalBuild
image_test.go:78: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal -p image-012000
image_test.go:78: (dbg) Done: out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal -p image-012000: (1.592378772s)
--- PASS: TestImageBuild/serial/NormalBuild (1.59s)

TestImageBuild/serial/BuildWithBuildArg (0.78s)

=== RUN   TestImageBuild/serial/BuildWithBuildArg
image_test.go:99: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest --build-opt=build-arg=ENV_A=test_env_str --build-opt=no-cache ./testdata/image-build/test-arg -p image-012000
--- PASS: TestImageBuild/serial/BuildWithBuildArg (0.78s)

TestImageBuild/serial/BuildWithDockerIgnore (0.75s)

=== RUN   TestImageBuild/serial/BuildWithDockerIgnore
image_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal --build-opt=no-cache -p image-012000
--- PASS: TestImageBuild/serial/BuildWithDockerIgnore (0.75s)

TestImageBuild/serial/BuildWithSpecifiedDockerfile (0.63s)

=== RUN   TestImageBuild/serial/BuildWithSpecifiedDockerfile
image_test.go:88: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest -f inner/Dockerfile ./testdata/image-build/test-f -p image-012000
--- PASS: TestImageBuild/serial/BuildWithSpecifiedDockerfile (0.63s)

TestJSONOutput/start/Command (75.05s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-880000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit 
E0831 15:52:52.710346    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p json-output-880000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit : (1m15.048809637s)
--- PASS: TestJSONOutput/start/Command (75.05s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.5s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 pause -p json-output-880000 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.50s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.45s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 unpause -p json-output-880000 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.45s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (8.33s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 stop -p json-output-880000 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 stop -p json-output-880000 --output=json --user=testUser: (8.333176666s)
--- PASS: TestJSONOutput/stop/Command (8.33s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.58s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-error-757000 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p json-output-error-757000 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (359.07711ms)

-- stdout --
	{"specversion":"1.0","id":"7e1b8b0e-c041-46bb-9286-da71b66ca1b3","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-757000] minikube v1.33.1 on Darwin 14.6.1","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"22986693-55f7-4091-adeb-21395f1d6fb7","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=18943"}}
	{"specversion":"1.0","id":"a6e80870-8a3c-452d-91c2-dbd925a040d5","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig"}}
	{"specversion":"1.0","id":"39d286b3-9976-4b7e-812c-ebf70aac075a","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-darwin-amd64"}}
	{"specversion":"1.0","id":"9195752d-4e30-4f46-b3e8-4c6f0811f46f","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"becf122d-99f6-4fd1-99a8-19db47ade38d","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube"}}
	{"specversion":"1.0","id":"b8bac6a3-7f0a-44b1-9bce-3f6be41e9b51","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"e3349642-926a-4946-8253-d5892f0c0e4a","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on darwin/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:176: Cleaning up "json-output-error-757000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p json-output-error-757000
--- PASS: TestErrorJSONOutput (0.58s)

TestMainNoArgs (0.08s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-darwin-amd64
--- PASS: TestMainNoArgs (0.08s)

TestMultiNode/serial/FreshStart2Nodes (106.52s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-957000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit 
E0831 15:57:18.515139    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 15:57:52.712579    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
multinode_test.go:96: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-957000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit : (1m46.285907077s)
multinode_test.go:102: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (106.52s)

TestMultiNode/serial/DeployApp2Nodes (5.14s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:498: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-957000 -- rollout status deployment/busybox: (3.124291668s)
multinode_test.go:505: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- exec busybox-7dff88458-9qs4p -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- exec busybox-7dff88458-rjh4x -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- exec busybox-7dff88458-9qs4p -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- exec busybox-7dff88458-rjh4x -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- exec busybox-7dff88458-9qs4p -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- exec busybox-7dff88458-rjh4x -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (5.14s)

TestMultiNode/serial/PingHostFrom2Pods (0.9s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- exec busybox-7dff88458-9qs4p -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- exec busybox-7dff88458-9qs4p -- sh -c "ping -c 1 192.169.0.1"
multinode_test.go:572: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- exec busybox-7dff88458-rjh4x -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-957000 -- exec busybox-7dff88458-rjh4x -- sh -c "ping -c 1 192.169.0.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.90s)

TestMultiNode/serial/AddNode (45.76s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-957000 -v 3 --alsologtostderr
E0831 15:59:15.434259    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
multinode_test.go:121: (dbg) Done: out/minikube-darwin-amd64 node add -p multinode-957000 -v 3 --alsologtostderr: (45.437807685s)
multinode_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (45.76s)

TestMultiNode/serial/MultiNodeLabels (0.05s)

=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-957000 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.05s)

TestMultiNode/serial/ProfileList (0.17s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.17s)

TestMultiNode/serial/CopyFile (5.24s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 status --output json --alsologtostderr
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp testdata/cp-test.txt multinode-957000:/home/docker/cp-test.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp multinode-957000:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile749792849/001/cp-test_multinode-957000.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp multinode-957000:/home/docker/cp-test.txt multinode-957000-m02:/home/docker/cp-test_multinode-957000_multinode-957000-m02.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m02 "sudo cat /home/docker/cp-test_multinode-957000_multinode-957000-m02.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp multinode-957000:/home/docker/cp-test.txt multinode-957000-m03:/home/docker/cp-test_multinode-957000_multinode-957000-m03.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m03 "sudo cat /home/docker/cp-test_multinode-957000_multinode-957000-m03.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp testdata/cp-test.txt multinode-957000-m02:/home/docker/cp-test.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp multinode-957000-m02:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile749792849/001/cp-test_multinode-957000-m02.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp multinode-957000-m02:/home/docker/cp-test.txt multinode-957000:/home/docker/cp-test_multinode-957000-m02_multinode-957000.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000 "sudo cat /home/docker/cp-test_multinode-957000-m02_multinode-957000.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp multinode-957000-m02:/home/docker/cp-test.txt multinode-957000-m03:/home/docker/cp-test_multinode-957000-m02_multinode-957000-m03.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m03 "sudo cat /home/docker/cp-test_multinode-957000-m02_multinode-957000-m03.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp testdata/cp-test.txt multinode-957000-m03:/home/docker/cp-test.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp multinode-957000-m03:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile749792849/001/cp-test_multinode-957000-m03.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp multinode-957000-m03:/home/docker/cp-test.txt multinode-957000:/home/docker/cp-test_multinode-957000-m03_multinode-957000.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000 "sudo cat /home/docker/cp-test_multinode-957000-m03_multinode-957000.txt"
helpers_test.go:557: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 cp multinode-957000-m03:/home/docker/cp-test.txt multinode-957000-m02:/home/docker/cp-test_multinode-957000-m03_multinode-957000-m02.txt
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:535: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 ssh -n multinode-957000-m02 "sudo cat /home/docker/cp-test_multinode-957000-m03_multinode-957000-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (5.24s)
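The CopyFile runs above all follow one pattern: copy a file with `minikube cp`, then read it back over `ssh` and compare. A minimal local sketch of that copy-then-verify loop, using plain `cp`/`cat` as stand-ins for the minikube commands (the file paths here are hypothetical, not taken from the test run):

```shell
# Illustrative sketch only: mirrors the copy-then-verify pattern above
# using local files in place of minikube's `cp` and `ssh` steps.
set -eu
src=$(mktemp)
dst=$(mktemp -d)/cp-test.txt
echo "hello from cp-test" > "$src"
cp "$src" "$dst"     # stands in for: minikube -p <profile> cp <src> <node>:<dst>
cat "$dst"           # stands in for: minikube -p <profile> ssh -n <node> "sudo cat <dst>"
cmp -s "$src" "$dst" && echo "contents match"
```

The test repeats this for every source/destination node pair, which is why each `cp` line in the log is followed by one or two `ssh ... cat` lines.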

                                                
                                    
TestMultiNode/serial/StopNode (2.84s)

                                                
                                                
=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p multinode-957000 node stop m03: (2.338254427s)
multinode_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-957000 status: exit status 7 (248.856601ms)

                                                
                                                
-- stdout --
	multinode-957000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-957000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-957000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-957000 status --alsologtostderr: exit status 7 (251.314369ms)

                                                
                                                
-- stdout --
	multinode-957000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-957000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-957000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0831 15:59:43.472966    4874 out.go:345] Setting OutFile to fd 1 ...
	I0831 15:59:43.473241    4874 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:59:43.473247    4874 out.go:358] Setting ErrFile to fd 2...
	I0831 15:59:43.473251    4874 out.go:392] TERM=,COLORTERM=, which probably does not support color
	I0831 15:59:43.473422    4874 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18943-957/.minikube/bin
	I0831 15:59:43.473600    4874 out.go:352] Setting JSON to false
	I0831 15:59:43.473622    4874 mustload.go:65] Loading cluster: multinode-957000
	I0831 15:59:43.473660    4874 notify.go:220] Checking for updates...
	I0831 15:59:43.473936    4874 config.go:182] Loaded profile config "multinode-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.31.0
	I0831 15:59:43.473951    4874 status.go:255] checking status of multinode-957000 ...
	I0831 15:59:43.474319    4874 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:59:43.474367    4874 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:59:43.483251    4874 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52954
	I0831 15:59:43.483634    4874 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:59:43.484053    4874 main.go:141] libmachine: Using API Version  1
	I0831 15:59:43.484063    4874 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:59:43.484304    4874 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:59:43.484420    4874 main.go:141] libmachine: (multinode-957000) Calling .GetState
	I0831 15:59:43.484515    4874 main.go:141] libmachine: (multinode-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:59:43.484582    4874 main.go:141] libmachine: (multinode-957000) DBG | hyperkit pid from json: 4580
	I0831 15:59:43.485746    4874 status.go:330] multinode-957000 host status = "Running" (err=<nil>)
	I0831 15:59:43.485769    4874 host.go:66] Checking if "multinode-957000" exists ...
	I0831 15:59:43.486009    4874 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:59:43.486032    4874 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:59:43.494362    4874 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52956
	I0831 15:59:43.494715    4874 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:59:43.495063    4874 main.go:141] libmachine: Using API Version  1
	I0831 15:59:43.495082    4874 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:59:43.495296    4874 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:59:43.501476    4874 main.go:141] libmachine: (multinode-957000) Calling .GetIP
	I0831 15:59:43.501603    4874 host.go:66] Checking if "multinode-957000" exists ...
	I0831 15:59:43.501854    4874 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:59:43.501873    4874 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:59:43.510309    4874 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52958
	I0831 15:59:43.510650    4874 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:59:43.510966    4874 main.go:141] libmachine: Using API Version  1
	I0831 15:59:43.510978    4874 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:59:43.511183    4874 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:59:43.511279    4874 main.go:141] libmachine: (multinode-957000) Calling .DriverName
	I0831 15:59:43.511411    4874 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:59:43.511431    4874 main.go:141] libmachine: (multinode-957000) Calling .GetSSHHostname
	I0831 15:59:43.511516    4874 main.go:141] libmachine: (multinode-957000) Calling .GetSSHPort
	I0831 15:59:43.511601    4874 main.go:141] libmachine: (multinode-957000) Calling .GetSSHKeyPath
	I0831 15:59:43.511682    4874 main.go:141] libmachine: (multinode-957000) Calling .GetSSHUsername
	I0831 15:59:43.511761    4874 sshutil.go:53] new ssh client: &{IP:192.169.0.13 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000/id_rsa Username:docker}
	I0831 15:59:43.542352    4874 ssh_runner.go:195] Run: systemctl --version
	I0831 15:59:43.547033    4874 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:59:43.558925    4874 kubeconfig.go:125] found "multinode-957000" server: "https://192.169.0.13:8443"
	I0831 15:59:43.558949    4874 api_server.go:166] Checking apiserver status ...
	I0831 15:59:43.558987    4874 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0831 15:59:43.569660    4874 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2010/cgroup
	W0831 15:59:43.576685    4874 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2010/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0831 15:59:43.576725    4874 ssh_runner.go:195] Run: ls
	I0831 15:59:43.580062    4874 api_server.go:253] Checking apiserver healthz at https://192.169.0.13:8443/healthz ...
	I0831 15:59:43.583048    4874 api_server.go:279] https://192.169.0.13:8443/healthz returned 200:
	ok
	I0831 15:59:43.583058    4874 status.go:422] multinode-957000 apiserver status = Running (err=<nil>)
	I0831 15:59:43.583067    4874 status.go:257] multinode-957000 status: &{Name:multinode-957000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:59:43.583078    4874 status.go:255] checking status of multinode-957000-m02 ...
	I0831 15:59:43.583319    4874 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:59:43.583339    4874 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:59:43.591835    4874 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52962
	I0831 15:59:43.592182    4874 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:59:43.592514    4874 main.go:141] libmachine: Using API Version  1
	I0831 15:59:43.592533    4874 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:59:43.592773    4874 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:59:43.592884    4874 main.go:141] libmachine: (multinode-957000-m02) Calling .GetState
	I0831 15:59:43.592962    4874 main.go:141] libmachine: (multinode-957000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:59:43.593039    4874 main.go:141] libmachine: (multinode-957000-m02) DBG | hyperkit pid from json: 4597
	I0831 15:59:43.594197    4874 status.go:330] multinode-957000-m02 host status = "Running" (err=<nil>)
	I0831 15:59:43.594208    4874 host.go:66] Checking if "multinode-957000-m02" exists ...
	I0831 15:59:43.594456    4874 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:59:43.594487    4874 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:59:43.602881    4874 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52964
	I0831 15:59:43.603254    4874 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:59:43.603616    4874 main.go:141] libmachine: Using API Version  1
	I0831 15:59:43.603632    4874 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:59:43.603848    4874 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:59:43.603959    4874 main.go:141] libmachine: (multinode-957000-m02) Calling .GetIP
	I0831 15:59:43.604040    4874 host.go:66] Checking if "multinode-957000-m02" exists ...
	I0831 15:59:43.604282    4874 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:59:43.604303    4874 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:59:43.612708    4874 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52966
	I0831 15:59:43.613079    4874 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:59:43.613423    4874 main.go:141] libmachine: Using API Version  1
	I0831 15:59:43.613441    4874 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:59:43.613674    4874 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:59:43.613786    4874 main.go:141] libmachine: (multinode-957000-m02) Calling .DriverName
	I0831 15:59:43.613943    4874 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0831 15:59:43.613954    4874 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHHostname
	I0831 15:59:43.614049    4874 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHPort
	I0831 15:59:43.614130    4874 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHKeyPath
	I0831 15:59:43.614209    4874 main.go:141] libmachine: (multinode-957000-m02) Calling .GetSSHUsername
	I0831 15:59:43.614283    4874 sshutil.go:53] new ssh client: &{IP:192.169.0.14 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18943-957/.minikube/machines/multinode-957000-m02/id_rsa Username:docker}
	I0831 15:59:43.644570    4874 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0831 15:59:43.655525    4874 status.go:257] multinode-957000-m02 status: &{Name:multinode-957000-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0831 15:59:43.655543    4874 status.go:255] checking status of multinode-957000-m03 ...
	I0831 15:59:43.655799    4874 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0831 15:59:43.655821    4874 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0831 15:59:43.664393    4874 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52969
	I0831 15:59:43.664775    4874 main.go:141] libmachine: () Calling .GetVersion
	I0831 15:59:43.665131    4874 main.go:141] libmachine: Using API Version  1
	I0831 15:59:43.665145    4874 main.go:141] libmachine: () Calling .SetConfigRaw
	I0831 15:59:43.665346    4874 main.go:141] libmachine: () Calling .GetMachineName
	I0831 15:59:43.665465    4874 main.go:141] libmachine: (multinode-957000-m03) Calling .GetState
	I0831 15:59:43.665550    4874 main.go:141] libmachine: (multinode-957000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0831 15:59:43.665622    4874 main.go:141] libmachine: (multinode-957000-m03) DBG | hyperkit pid from json: 4668
	I0831 15:59:43.666777    4874 main.go:141] libmachine: (multinode-957000-m03) DBG | hyperkit pid 4668 missing from process table
	I0831 15:59:43.666820    4874 status.go:330] multinode-957000-m03 host status = "Stopped" (err=<nil>)
	I0831 15:59:43.666831    4874 status.go:343] host is not running, skipping remaining checks
	I0831 15:59:43.666837    4874 status.go:257] multinode-957000-m03 status: &{Name:multinode-957000-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.84s)
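The `status` stdout above has a regular record shape (node name, then `type:`/`host:`/`kubelet:` fields), so the per-node host state can be pulled out mechanically. A sketch of one way to post-process it; the status text below is pasted from the log, while the `awk` pattern is an assumption about how such output might be consumed, not anything the test itself does:

```shell
# Sketch: extract the per-node "host:" state from the status text above.
status='multinode-957000
type: Control Plane
host: Running

multinode-957000-m03
type: Worker
host: Stopped'
printf '%s\n' "$status" | awk '/^host:/ {print $2}'
```

This prints `Running` and `Stopped`, matching the two nodes quoted in the snippet.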

                                                
                                    
TestMultiNode/serial/StartAfterStop (41.67s)

                                                
                                                
=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 node start m03 -v=7 --alsologtostderr
multinode_test.go:282: (dbg) Done: out/minikube-darwin-amd64 -p multinode-957000 node start m03 -v=7 --alsologtostderr: (41.308736947s)
multinode_test.go:290: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-957000 status -v=7 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (41.67s)

                                                
                                    
TestMultiNode/serial/ValidateNameConflict (43.85s)

                                                
                                                
=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-957000
multinode_test.go:464: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-957000-m03 --driver=hyperkit 
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-957000-m03 --driver=hyperkit : exit status 14 (434.499817ms)

                                                
                                                
-- stdout --
	* [multinode-957000-m03] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! Profile name 'multinode-957000-m03' is duplicated with machine name 'multinode-957000-m03' in profile 'multinode-957000'
	X Exiting due to MK_USAGE: Profile name should be unique

                                                
                                                
** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-957000-m04 --driver=hyperkit 
E0831 16:07:52.715467    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
multinode_test.go:472: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-957000-m04 --driver=hyperkit : (37.732607492s)
multinode_test.go:479: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-957000
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p multinode-957000: exit status 80 (321.318115ms)

                                                
                                                
-- stdout --
	* Adding node m04 to cluster multinode-957000 as [worker]
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-957000-m04 already exists in multinode-957000-m04 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-darwin-amd64 delete -p multinode-957000-m04
multinode_test.go:484: (dbg) Done: out/minikube-darwin-amd64 delete -p multinode-957000-m04: (5.30429097s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (43.85s)
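Note that the two expected failures above are distinguished purely by exit status: 14 for the `MK_USAGE` duplicate-profile error and 80 for the `GUEST_NODE_ADD` conflict. A sketch of dispatching on those statuses; `fake_minikube` is a hypothetical stand-in, and the status-to-message mapping is taken only from the two runs logged above:

```shell
# Sketch: branch on the exit statuses observed above (14 = MK_USAGE
# duplicate profile, 80 = GUEST_NODE_ADD node already exists).
fake_minikube() { return 14; }   # stand-in that simulates the duplicate-profile failure
rc=0
fake_minikube || rc=$?
case "$rc" in
  0)  echo "started" ;;
  14) echo "usage error: duplicate profile name" ;;
  80) echo "guest error: node already exists" ;;
  *)  echo "unexpected exit status: $rc" ;;
esac
```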

                                                
                                    
TestPreload (139.59s)

                                                
                                                
=== RUN   TestPreload
preload_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-024000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4
E0831 16:09:15.445393    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
preload_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-024000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4: (1m14.75574445s)
preload_test.go:52: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-024000 image pull gcr.io/k8s-minikube/busybox
preload_test.go:52: (dbg) Done: out/minikube-darwin-amd64 -p test-preload-024000 image pull gcr.io/k8s-minikube/busybox: (1.477449385s)
preload_test.go:58: (dbg) Run:  out/minikube-darwin-amd64 stop -p test-preload-024000
preload_test.go:58: (dbg) Done: out/minikube-darwin-amd64 stop -p test-preload-024000: (8.442652517s)
preload_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-024000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit 
preload_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-024000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit : (49.517125842s)
preload_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-024000 image list
helpers_test.go:176: Cleaning up "test-preload-024000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p test-preload-024000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p test-preload-024000: (5.244395033s)
--- PASS: TestPreload (139.59s)

                                                
                                    
TestSkaffold (109.83s)

                                                
                                                
=== RUN   TestSkaffold
skaffold_test.go:59: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe2516219739 version
skaffold_test.go:59: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe2516219739 version: (1.70779799s)
skaffold_test.go:63: skaffold version: v2.13.2
skaffold_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p skaffold-008000 --memory=2600 --driver=hyperkit 
E0831 16:12:52.736907    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
skaffold_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p skaffold-008000 --memory=2600 --driver=hyperkit : (35.8903512s)
skaffold_test.go:86: copying out/minikube-darwin-amd64 to /Users/jenkins/workspace/out/minikube
skaffold_test.go:105: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe2516219739 run --minikube-profile skaffold-008000 --kube-context skaffold-008000 --status-check=true --port-forward=false --interactive=false
E0831 16:13:58.542818    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
skaffold_test.go:105: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe2516219739 run --minikube-profile skaffold-008000 --kube-context skaffold-008000 --status-check=true --port-forward=false --interactive=false: (54.513961319s)
skaffold_test.go:111: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-app" in namespace "default" ...
helpers_test.go:345: "leeroy-app-6d7bc8bf96-29nqc" [e1991b1c-07b5-4ab6-a1e7-940c32dee8bb] Running
skaffold_test.go:111: (dbg) TestSkaffold: app=leeroy-app healthy within 6.006080715s
skaffold_test.go:114: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-web" in namespace "default" ...
helpers_test.go:345: "leeroy-web-7fcbc4bd68-rp7tq" [6b8cce73-f865-4e98-94d7-1e71818009c9] Running
E0831 16:14:15.461530    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
skaffold_test.go:114: (dbg) TestSkaffold: app=leeroy-web healthy within 5.004947034s
helpers_test.go:176: Cleaning up "skaffold-008000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p skaffold-008000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p skaffold-008000: (5.249931153s)
--- PASS: TestSkaffold (109.83s)
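The "waiting 1m0s for pods matching ..." lines above are a poll-until-healthy loop over pods with a given label. A stripped-down sketch of that shape; `pod_running` here is a stub standing in for a real readiness check (e.g. a `kubectl get pods -l app=leeroy-app` query), and the poll count is invented for illustration:

```shell
# Sketch of the wait-until-healthy polling the test performs.
attempts=0
pod_running() { [ "$attempts" -ge 3 ]; }   # stub: pretend the pod turns Running on the 3rd poll
until pod_running; do
  attempts=$((attempts + 1))
done
echo "healthy after $attempts polls"
```

The real helper also enforces the 1m0s deadline and reports the elapsed time (`healthy within 6.006080715s` above) rather than a poll count.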

                                                
                                    
TestRunningBinaryUpgrade (87.79s)

                                                
                                                
=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.320043921 start -p running-upgrade-341000 --memory=2200 --vm-driver=hyperkit 
version_upgrade_test.go:120: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.320043921 start -p running-upgrade-341000 --memory=2200 --vm-driver=hyperkit : (48.412831569s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 start -p running-upgrade-341000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E0831 16:52:11.747805    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
version_upgrade_test.go:130: (dbg) Done: out/minikube-darwin-amd64 start -p running-upgrade-341000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (32.278882183s)
helpers_test.go:176: Cleaning up "running-upgrade-341000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p running-upgrade-341000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p running-upgrade-341000: (5.385494889s)
--- PASS: TestRunningBinaryUpgrade (87.79s)

                                                
                                    
TestKubernetesUpgrade (119.61s)

                                                
                                                
=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-498000 --memory=2200 --kubernetes-version=v1.20.0 --alsologtostderr -v=1 --driver=hyperkit 
E0831 16:27:52.743830    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
version_upgrade_test.go:222: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-498000 --memory=2200 --kubernetes-version=v1.20.0 --alsologtostderr -v=1 --driver=hyperkit : (53.678792332s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-darwin-amd64 stop -p kubernetes-upgrade-498000
version_upgrade_test.go:227: (dbg) Done: out/minikube-darwin-amd64 stop -p kubernetes-upgrade-498000: (2.362581929s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-darwin-amd64 -p kubernetes-upgrade-498000 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p kubernetes-upgrade-498000 status --format={{.Host}}: exit status 7 (68.889896ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-498000 --memory=2200 --kubernetes-version=v1.31.0 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:243: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-498000 --memory=2200 --kubernetes-version=v1.31.0 --alsologtostderr -v=1 --driver=hyperkit : (33.478931713s)
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-498000 version --output=json
version_upgrade_test.go:267: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:269: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-498000 --memory=2200 --kubernetes-version=v1.20.0 --driver=hyperkit 
version_upgrade_test.go:269: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p kubernetes-upgrade-498000 --memory=2200 --kubernetes-version=v1.20.0 --driver=hyperkit : exit status 106 (666.341566ms)

-- stdout --
	* [kubernetes-upgrade-498000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.31.0 cluster to v1.20.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.20.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-498000
	    minikube start -p kubernetes-upgrade-498000 --kubernetes-version=v1.20.0
	    
	    2) Create a second cluster with Kubernetes 1.20.0, by running:
	    
	    minikube start -p kubernetes-upgrade-4980002 --kubernetes-version=v1.20.0
	    
	    3) Use the existing cluster at version Kubernetes 1.31.0, by running:
	    
	    minikube start -p kubernetes-upgrade-498000 --kubernetes-version=v1.31.0
	    

** /stderr **
version_upgrade_test.go:273: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:275: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-498000 --memory=2200 --kubernetes-version=v1.31.0 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:275: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-498000 --memory=2200 --kubernetes-version=v1.31.0 --alsologtostderr -v=1 --driver=hyperkit : (24.075339984s)
helpers_test.go:176: Cleaning up "kubernetes-upgrade-498000" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 delete -p kubernetes-upgrade-498000
helpers_test.go:179: (dbg) Done: out/minikube-darwin-amd64 delete -p kubernetes-upgrade-498000: (5.23221792s)
--- PASS: TestKubernetesUpgrade (119.61s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.28s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current
* minikube v1.33.1 on darwin
- MINIKUBE_LOCATION=18943
- KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2872544203/001
* Using the hyperkit driver based on user configuration
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2872544203/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2872544203/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2872544203/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting "minikube" primary control-plane node in "minikube" cluster
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.28s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.56s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current
* minikube v1.33.1 on darwin
- MINIKUBE_LOCATION=18943
- KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current1826226195/001
* Using the hyperkit driver based on user configuration
* Downloading driver docker-machine-driver-hyperkit:
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current1826226195/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current1826226195/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current1826226195/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting "minikube" primary control-plane node in "minikube" cluster
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.56s)

TestStoppedBinaryUpgrade/Setup (1.67s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (1.67s)

TestStoppedBinaryUpgrade/Upgrade (1339.51s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.39320700 start -p stopped-upgrade-204000 --memory=2200 --vm-driver=hyperkit 
E0831 16:29:08.582762    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:29:15.465728    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:30:38.562449    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:32:52.757191    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:34:08.594897    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:34:15.480231    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:35:31.675475    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:37:52.758079    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:39:08.596944    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:39:15.482532    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:39:15.842250    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
version_upgrade_test.go:183: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.39320700 start -p stopped-upgrade-204000 --memory=2200 --vm-driver=hyperkit : (11m20.111761583s)
version_upgrade_test.go:192: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.39320700 -p stopped-upgrade-204000 stop
version_upgrade_test.go:192: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.39320700 -p stopped-upgrade-204000 stop: (8.228156168s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-darwin-amd64 start -p stopped-upgrade-204000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E0831 16:42:52.762216    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:44:08.599725    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:44:15.483851    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:47:18.631960    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:47:52.823092    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:49:08.661667    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:49:15.547006    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
version_upgrade_test.go:198: (dbg) Done: out/minikube-darwin-amd64 start -p stopped-upgrade-204000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (10m51.166026672s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (1339.51s)

TestStoppedBinaryUpgrade/MinikubeLogs (2.84s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-darwin-amd64 logs -p stopped-upgrade-204000
version_upgrade_test.go:206: (dbg) Done: out/minikube-darwin-amd64 logs -p stopped-upgrade-204000: (2.835449041s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.84s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.66s)

=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-868000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit 
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p NoKubernetes-868000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit : exit status 14 (661.568432ms)

-- stdout --
	* [NoKubernetes-868000] minikube v1.33.1 on Darwin 14.6.1
	  - MINIKUBE_LOCATION=18943
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18943-957/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18943-957/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.66s)

TestNoKubernetes/serial/StartWithK8s (41.38s)

=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-868000 --driver=hyperkit 
no_kubernetes_test.go:95: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-868000 --driver=hyperkit : (41.147789322s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-868000 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (41.38s)

TestNoKubernetes/serial/StartWithStopK8s (9.28s)

=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-868000 --no-kubernetes --driver=hyperkit 
no_kubernetes_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-868000 --no-kubernetes --driver=hyperkit : (6.675598547s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-868000 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p NoKubernetes-868000 status -o json: exit status 2 (177.725404ms)

-- stdout --
	{"Name":"NoKubernetes-868000","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-darwin-amd64 delete -p NoKubernetes-868000
no_kubernetes_test.go:124: (dbg) Done: out/minikube-darwin-amd64 delete -p NoKubernetes-868000: (2.430500889s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (9.28s)

TestNoKubernetes/serial/Start (78.99s)

=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-868000 --no-kubernetes --driver=hyperkit 
E0831 16:52:52.827983    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/functional-593000/client.crt: no such file or directory" logger="UnhandledError"
no_kubernetes_test.go:136: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-868000 --no-kubernetes --driver=hyperkit : (1m18.993744249s)
--- PASS: TestNoKubernetes/serial/Start (78.99s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.13s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-868000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-868000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (129.606303ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.13s)

TestNoKubernetes/serial/ProfileList (0.37s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 profile list
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.37s)

TestNoKubernetes/serial/Stop (2.36s)

=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-darwin-amd64 stop -p NoKubernetes-868000
no_kubernetes_test.go:158: (dbg) Done: out/minikube-darwin-amd64 stop -p NoKubernetes-868000: (2.355131978s)
--- PASS: TestNoKubernetes/serial/Stop (2.36s)

TestNoKubernetes/serial/StartNoArgs (75.57s)

=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-868000 --driver=hyperkit 
E0831 16:54:08.669996    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/skaffold-008000/client.crt: no such file or directory" logger="UnhandledError"
E0831 16:54:15.553092    1483 cert_rotation.go:171] "Unhandled Error" err="key failed with : open /Users/jenkins/minikube-integration/18943-957/.minikube/profiles/addons-540000/client.crt: no such file or directory" logger="UnhandledError"
no_kubernetes_test.go:191: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-868000 --driver=hyperkit : (1m15.566459083s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (75.57s)

TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.12s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-868000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-868000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (124.218815ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.12s)

Test skip (19/220)

TestDownloadOnly/v1.20.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.20.0/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.20.0/cached-images (0.00s)

TestDownloadOnly/v1.20.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.20.0/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.20.0/binaries (0.00s)

TestDownloadOnly/v1.31.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.31.0/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.31.0/cached-images (0.00s)

TestDownloadOnly/v1.31.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.31.0/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.31.0/binaries (0.00s)

TestDownloadOnlyKic (0s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:220: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

=== CONT  TestAddons/parallel/Olm
addons_test.go:500: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestDockerEnvContainerd (0s)

=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with docker false darwin amd64
docker_test.go:172: skipping: TestDockerEnvContainerd can only be run with the containerd runtime on Docker driver
--- SKIP: TestDockerEnvContainerd (0.00s)

TestKVMDriverInstallOrUpdate (0s)

=== RUN   TestKVMDriverInstallOrUpdate
driver_install_or_update_test.go:41: Skip if not linux.
--- SKIP: TestKVMDriverInstallOrUpdate (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:550: only validate podman env with docker container runtime, currently testing docker
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestImageBuild/serial/validateImageBuildWithBuildEnv (0s)

=== RUN   TestImageBuild/serial/validateImageBuildWithBuildEnv
image_test.go:114: skipping due to https://github.com/kubernetes/minikube/issues/12431
--- SKIP: TestImageBuild/serial/validateImageBuildWithBuildEnv (0.00s)

TestKicCustomNetwork (0s)

=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

TestKicExistingNetwork (0s)

=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

TestKicCustomSubnet (0s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

TestKicStaticIP (0s)

=== RUN   TestKicStaticIP
kic_custom_network_test.go:123: only run with docker/podman driver
--- SKIP: TestKicStaticIP (0.00s)

TestContainerIPsMultiNetwork (0s)

=== RUN   TestContainerIPsMultiNetwork
multinetwork_test.go:43: running with runtime:docker goos:darwin goarch:amd64
multinetwork_test.go:45: skipping: only docker driver supported
--- SKIP: TestContainerIPsMultiNetwork (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestInsufficientStorage (0s)

=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

TestMissingContainerUpgrade (0s)

=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:284: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)